mirror of https://github.com/DanielnetoDotCom/YouPHPTube
Daniel Neto 2023-06-30 08:55:17 -03:00
parent 746e163d01
commit 1c7ea28b46
808 changed files with 316395 additions and 381162 deletions

node_modules/hls.js/README.md (generated, vendored; 68 changed lines)

@ -2,10 +2,11 @@
[![npm](https://img.shields.io/npm/v/hls.js/canary.svg?style=flat)](https://www.npmjs.com/package/hls.js/v/canary)
[![](https://data.jsdelivr.com/v1/package/npm/hls.js/badge?style=rounded)](https://www.jsdelivr.com/package/npm/hls.js)
[![Sauce Test Status](https://saucelabs.com/buildstatus/robwalch)](https://app.saucelabs.com/u/robwalch)
[![jsDelivr](https://data.jsdelivr.com/v1/package/npm/hls.js/badge)](https://www.jsdelivr.com/package/npm/hls.js)
[comment]: <> ([![Sauce Test Status]&#40;https://saucelabs.com/browser-matrix/robwalch.svg&#41;]&#40;https://saucelabs.com/u/robwalch&#41;)
# ![HLS.js](./docs/logo.svg)
# ![HLS.js](https://raw.githubusercontent.com/video-dev/hls.js/master/docs/logo.svg)
HLS.js is a JavaScript library that implements an [HTTP Live Streaming] client.
It relies on [HTML5 video][] and [MediaSource Extensions][] for playback.
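As an illustrative sketch of the client API described above (assuming a page with a `<video id="video">` element and a placeholder stream URL):

```ts
import Hls from 'hls.js';

const video = document.getElementById('video') as HTMLVideoElement;
const src = 'https://example.com/streams/master.m3u8'; // placeholder URL

if (Hls.isSupported()) {
  // MSE path: hls.js downloads playlists/segments and feeds them to MediaSource
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => {
    video.play();
  });
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Browsers with native HLS support (e.g. Safari) can play the URL directly
  video.src = src;
}
```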
@ -18,7 +19,7 @@ HLS.js works directly on top of a standard HTML`<video>` element.
HLS.js is written in [ECMAScript6] (`*.js`) and [TypeScript] (`*.ts`) (a strongly typed superset of ES6), and transpiled to ECMAScript5 using [Babel](https://babeljs.io/) and the [TypeScript compiler].
[Webpack] is used to build the distro bundle and serve the local development environment.
[Rollup] is used to build the distro bundle and serve the local development environment.
[html5 video]: https://www.html5rocks.com/en/tutorials/video/basics/
[mediasource extensions]: https://w3c.github.io/media-source/
@ -27,7 +28,7 @@ HLS.js is written in [ECMAScript6] (`*.js`) and [TypeScript] (`*.ts`) (strongly
[ecmascript6]: https://github.com/ericdouglas/ES6-Learning#articles--tutorials
[typescript]: https://www.typescriptlang.org/
[typescript compiler]: https://www.typescriptlang.org/docs/handbook/compiler-options.html
[webpack]: https://webpack.js.org/
[rollup]: https://rollupjs.org/
## Features
@ -46,6 +47,7 @@ HLS.js is written in [ECMAScript6] (`*.js`) and [TypeScript] (`*.ts`) (strongly
- SAMPLE-AES decryption (only supported if using MPEG-2 TS container)
- Encrypted media extensions (EME) support for DRM (digital rights management)
- FairPlay, PlayReady, Widevine CDMs with fmp4 segments
- Level capping based on HTMLMediaElement resolution, dropped-frames, and HDCP-Level
- CEA-608/708 captions
- WebVTT subtitles
- Alternate Audio Track Rendition (Master Playlist with Alternative Audio) for VoD and Live playlists
@ -65,23 +67,28 @@ HLS.js is written in [ECMAScript6] (`*.js`) and [TypeScript] (`*.ts`) (strongly
- Retry mechanism embedded in the library
- Recovery actions can be triggered to fix fatal media or network errors
- [Redundant/Failover Playlists](https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW22)
- HLS Variable Substitution
### Supported M3U8 tags
### Supported HLS tags
For details on the HLS format and these tags' meanings, see https://tools.ietf.org/html/draft-pantos-hls-rfc8216bis-08
For details on the HLS format and these tags' meanings, see https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis
#### Manifest tags
#### Multivariant Playlist tags
- `#EXT-X-STREAM-INF:<attribute-list>`
`<URI>`
- `#EXT-X-MEDIA:<attribute-list>`
- `#EXT-X-SESSION-DATA:<attribute-list>`
- `#EXT-X-SESSION-KEY:<attribute-list>` EME Key-System selection and preloading
- `#EXT-X-START:TIME-OFFSET=<n>`
- `#EXT-X-CONTENT-STEERING:<attribute-list>` Content Steering
- `#EXT-X-DEFINE:<attribute-list>` Variable Substitution (`NAME,VALUE,QUERYPARAM` attributes)
The following properties are added to their respective variants' attribute list but are not implemented in their selection and playback.
- `VIDEO-RANGE` and `HDCP-LEVEL` (See [#2489](https://github.com/video-dev/hls.js/issues/2489))
- `VIDEO-RANGE` (See [#2489](https://github.com/video-dev/hls.js/issues/2489))
#### Playlist tags
#### Media Playlist tags
- `#EXTM3U`
- `#EXT-X-VERSION=<n>`
@ -99,15 +106,20 @@ The following properties are added to their respective variants' attribute list
- `#EXT-X-SERVER-CONTROL:<attribute-list>`
- `#EXT-X-PART-INF:PART-TARGET=<n>`
- `#EXT-X-PART:<attribute-list>`
- `#EXT-X-PRELOAD-HINT:<attribute-list>`
- `#EXT-X-SKIP:<attribute-list>`
- `#EXT-X-SKIP:<attribute-list>` Delta Playlists
- `#EXT-X-RENDITION-REPORT:<attribute-list>`
- `#EXT-X-DATERANGE:<attribute-list>`
- `#EXT-X-DATERANGE:<attribute-list>` Metadata
- `#EXT-X-DEFINE:<attribute-list>` Variable Import and Substitution (`NAME,VALUE,IMPORT,QUERYPARAM` attributes)
- `#EXT-X-GAP` (Skips loading GAP segments and parts. Skips playback of unbuffered program containing only GAP content and no suitable alternates. See [#2940](https://github.com/video-dev/hls.js/issues/2940))
The following tags are added to their respective fragment's attribute list but are not implemented in streaming and playback.
- `#EXT-X-BITRATE` (Not used in ABR controller)
- `#EXT-X-GAP` (Not implemented. See [#2940](https://github.com/video-dev/hls.js/issues/2940))
Parsed but missing feature support
- `#EXT-X-PRELOAD-HINT:<attribute-list>` (See [#5074](https://github.com/video-dev/hls.js/issues/3988))
### Not Supported
@ -116,7 +128,6 @@ For a complete list of issues, see ["Top priorities" in the Release Planning and
- Advanced variant selection based on runtime media capabilities (See issues labeled [`media-capabilities`](https://github.com/video-dev/hls.js/labels/media-capabilities))
- HLS Content Steering
- HLS Interstitials
- `#EXT-X-DEFINE` variable substitution
- `#EXT-X-GAP` filling [#2940](https://github.com/video-dev/hls.js/issues/2940)
- `#EXT-X-I-FRAME-STREAM-INF` I-frame Media Playlist files
- `SAMPLE-AES` with fmp4, aac, mp3, vtt... segments (MPEG-2 TS only)
@ -131,6 +142,8 @@ You can safely require this library in Node and **absolutely nothing will happen
## Getting started with development
[![Open in StackBlitz](https://developer.stackblitz.com/img/open_in_stackblitz.svg)](https://stackblitz.com/github/video-dev/hls.js/tree/master?title=HLS.JS)
First, check out the repository and install the required dependencies:
```sh
@ -164,7 +177,7 @@ Only debug-mode artifacts:
npm run build:debug
```
Build and watch (customized dev setups where you'll want to host through another server than webpacks' - for example in a sub-module/project)
Build and watch (customized dev setups where you'll want to host through another server - for example in a sub-module/project)
```
npm run build:watch
@ -178,8 +191,7 @@ npm run build -- --env dist # replace "dist" by other configuration name, see ab
Note: The "demo" config is always built.
**NOTE:** `hls.light.*.js` dist files do not include EME, subtitles, CMCD, or alternate-audio support. In addition,
the following types are not available in the light build:
**NOTE:** `hls.light.*.js` dist files do not include alternate-audio, subtitles, CMCD, EME (DRM), or Variable Substitution support. In addition, the following types are not available in the light build:
- `AudioStreamController`
- `AudioTrackController`
@ -259,8 +271,8 @@ An overview of this project's design, it's modules, events, and error handling c
## API docs and usage guide
- [API and usage docs, with code examples](./docs/API.md)
- [Auto-Generated API Docs (Latest Release)](https://hls-js.netlify.com/api-docs)
- [Auto-Generated API Docs (Development Branch)](https://hls-js-dev.netlify.com/api-docs)
- [Auto-Generated API Docs (Latest Release)](https://hlsjs.video-dev.org/api-docs)
- [Auto-Generated API Docs (Development Branch)](https://hlsjs-dev.video-dev.org/api-docs)
_Note you can access the docs for a particular version using "[https://github.com/video-dev/hls.js/tree/deployments](https://github.com/video-dev/hls.js/tree/deployments)"_
@ -268,18 +280,16 @@ _Note you can access the docs for a particular version using "[https://github.co
### Latest Release
[https://hls-js.netlify.com/demo](https://hls-js.netlify.com/demo)
[https://hlsjs.video-dev.org/demo](https://hlsjs.video-dev.org/demo)
### Master
[https://hls-js-dev.netlify.com/demo](https://hls-js-dev.netlify.com/demo)
[https://hlsjs-dev.video-dev.org/demo](https://hlsjs-dev.video-dev.org/demo)
### Specific Version
Find the commit on [https://github.com/video-dev/hls.js/tree/deployments](https://github.com/video-dev/hls.js/tree/deployments).
[![](https://www.netlify.com/img/global/badges/netlify-color-accent.svg)](https://www.netlify.com)
[![](https://opensource.saucelabs.com/images/opensauce/powered-by-saucelabs-badge-gray.png?sanitize=true)](https://saucelabs.com)
## Compatibility
@ -400,6 +410,10 @@ All HLS resources must be delivered with [CORS headers](https://developer.mozill
Video is controlled through HTML `<video>` element `HTMLVideoElement` methods, events and optional UI controls (`<video controls>`).
## Build a Custom UI
- [Media Chrome](https://github.com/muxinc/media-chrome)
## Player Integration
The following players integrate HLS.js for HLS playback:
@ -424,15 +438,15 @@ The following players integrate HLS.js for HLS playback:
| | | | |
| :----------------------------------------------------------------------------------------------------------------------------------------------------------: | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :------------------------------------------------------------------------------------------------------------------------------------------------------------: | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: |
| [<img src="https://i.cdn.turner.com/adultswim/big/img/global/adultswim.jpg" width="120">](https://www.adultswim.com/streams) | [<img src="https://avatars3.githubusercontent.com/u/5497190?s=200&v=4" width="120">](https://www.akamai.com) | [<img src="https://upload.wikimedia.org/wikipedia/commons/thumb/1/1a/Canal%2B.svg/2000px-Canal%2B.svg.png" width="120">](https://www.canalplus.fr) | [<img src="https://avatars2.githubusercontent.com/u/115313" width="120">](https://www.dailymotion.com) |
| [<img src="https://user-images.githubusercontent.com/4006693/44003595-baff193c-9e8f-11e8-9848-7bb91563499f.png" width="120">](https://freshlive.tv) | [<img src="https://flowplayer.org/media/img/logo-blue.png" width="120">](https://flowplayer.com) | [<img src="https://avatars1.githubusercontent.com/u/12554082?s=240" width="120">](https://www.foxsports.com.au) | [<img src="https://cloud.githubusercontent.com/assets/244265/12556435/dfaceb48-c353-11e5-971b-2c4429725469.png" width="120">](https://www.globo.com) |
| [<img src="https://user-images.githubusercontent.com/4006693/44003595-baff193c-9e8f-11e8-9848-7bb91563499f.png" width="120">](https://freshlive.tv) | [<img src="https://user-images.githubusercontent.com/360826/231535440-7cf075f1-bf38-4640-a0a7-d9ff74a1e396.png" width="120">](https://www.mux.com/) | [<img src="https://avatars1.githubusercontent.com/u/12554082?s=240" width="120">](https://www.foxsports.com.au) | [<img src="https://cloud.githubusercontent.com/assets/244265/12556435/dfaceb48-c353-11e5-971b-2c4429725469.png" width="120">](https://www.globo.com) |
| [<img src="https://images.gunosy.com/logo/gunosy_icon_company_logo.png" width="120">](https://gunosy.com) | [<img src="https://user-images.githubusercontent.com/1480052/35802840-f8e85b8a-0a71-11e8-8eb2-eee323e3f159.png" width="120">](https://www.gl-systemhaus.de/) | [<img src="https://cloud.githubusercontent.com/assets/6525783/20801836/700490de-b7ea-11e6-82bd-e249f91c7bae.jpg" width="120">](https://nettrek.de) | [<img src="https://cloud.githubusercontent.com/assets/244265/12556385/999aa884-c353-11e5-9102-79df54384498.png" width="120">](https://www.nytimes.com/) |
| [<img src="https://cloud.githubusercontent.com/assets/1798553/20356424/ba158574-ac24-11e6-95e1-1ae591b11a0a.png" width="120">](https://www.peer5.com/) | [<img src="https://cloud.githubusercontent.com/assets/4909096/20925062/e26e6fc8-bbb4-11e6-99a5-d4762274a342.png" width="120">](https://www.qbrick.com) | [<img src="https://www.radiantmediaplayer.com/images/radiantmediaplayer-new-logo-640.jpg" width="120">](https://www.radiantmediaplayer.com/) | [<img src="https://www.rts.ch/hummingbird-static/images/logos/logo_marts.svg" width="120">](https://www.rts.ch) |
| [<img src="https://cloud.githubusercontent.com/assets/12702747/19316434/0a3601de-9067-11e6-85e2-936b1cb099a0.png" width="120">](https://www.snapstream.com/) | [<img src="https://pamediagroup.com/wp-content/uploads/2019/05/StreamAMG-Logo-RGB.png" width="120">](https://www.streamamg.com/) | [<img src="https://streamsharkio.sa.metacdn.com/wp-content/uploads/2015/10/streamshark-dark.svg" width="120">](https://streamshark.io/) | [<img src="https://camo.githubusercontent.com/9580f10e9bfa8aa7fba52c5cb447bee0757e33da/68747470733a2f2f7777772e7461626c6f74762e636f6d2f7374617469632f696d616765732f7461626c6f5f6c6f676f2e706e67" width="120">](https://my.tablotv.com/) |
| [<img src="https://user-images.githubusercontent.com/2803310/34083705-349c8fd0-e375-11e7-92a6-5c38509f4936.png" width="120">](https://www.streamroot.io/) | [<img src="https://vignette1.wikia.nocookie.net/tedtalks/images/c/c0/TED_logo.png/revision/20150915192527" width="120">](https://www.ted.com/) | [<img src="https://www.seeklogo.net/wp-content/uploads/2014/12/twitter-logo-vector-download.jpg" width="120">](https://twitter.com/) | [<img src="https://player.clevercast.com/img/clevercast.png" width="120">](https://www.clevercast.com) |
| [<img src="https://player.mtvnservices.com/edge/hosted/Viacom_logo.svg" width="120">](https://www.viacom.com/) | [<img src="https://user-images.githubusercontent.com/1181974/29248959-efabc440-802d-11e7-8050-7c1f4ca6c607.png" width="120">](https://vk.com/) | [<img src="https://avatars0.githubusercontent.com/u/5090060?s=200&v=4" width="120">](https://www.jwplayer.com) | [<img src="https://staticftv-a.akamaihd.net/arches/francetv/default/img/og-image.jpg?20161007" width="120">](https://www.france.tv) |
| [<img src="https://showmax.akamaized.net/e/logo/showmax_black.png" width="120">](https://tech.showmax.com) | [<img src="https://static3.1tv.ru/assets/web/logo-ac67852f1625b338f9d1fb96be089d03557d50bfc5790d5f48dc56799f59dec6.svg" width="120" height="120">](https://www.1tv.ru/) | [<img src="https://user-images.githubusercontent.com/1480052/40482633-c013ebce-5f55-11e8-96d5-b776415de0ce.png" width="120">](https://www.zdf.de) | [<img src="https://github.com/cdnbye/hlsjs-p2p-engine/blob/master/figs/cdnbye.png" width="120">](https://github.com/cdnbye/hlsjs-p2p-engine) |
| [<img src="https://user-images.githubusercontent.com/2803310/34083705-349c8fd0-e375-11e7-92a6-5c38509f4936.png" width="120">](https://www.streamroot.io/) | [<img src="https://user-images.githubusercontent.com/360826/231538721-156a865d-a505-45e7-a362-dafbaf2b182f.png" width="120">](https://www.ted.com/) | [<img src="https://www.seeklogo.net/wp-content/uploads/2014/12/twitter-logo-vector-download.jpg" width="120">](https://twitter.com/) | [<img src="https://player.clevercast.com/img/clevercast.png" width="120">](https://www.clevercast.com) |
| [<img src="https://player.mtvnservices.com/edge/hosted/Viacom_logo.svg" width="120">](https://www.viacom.com/) | [<img src="https://user-images.githubusercontent.com/1181974/29248959-efabc440-802d-11e7-8050-7c1f4ca6c607.png" width="120">](https://vk.com/) | [<img src="https://avatars0.githubusercontent.com/u/5090060?s=200&v=4" width="120">](https://www.jwplayer.com) | [<img src="https://raw.githubusercontent.com/kaltura/kaltura-player-js/master/docs/images/kaltura-logo.svg" width="120">](https://corp.kaltura.com/) |
| [<img src="https://showmax.akamaized.net/e/logo/showmax_black.png" width="120">](https://tech.showmax.com) | [<img src="https://static3.1tv.ru/assets/web/logo-ac67852f1625b338f9d1fb96be089d03557d50bfc5790d5f48dc56799f59dec6.svg" width="120" height="120">](https://www.1tv.ru/) | [<img src="https://user-images.githubusercontent.com/1480052/40482633-c013ebce-5f55-11e8-96d5-b776415de0ce.png" width="120">](https://www.zdf.de) | [<img src="https://cms-static.brid.tv/img/brid-logo-120x120.jpg" width="120">](https://www.brid.tv/) |
| [cdn77](https://streaming.cdn77.com/) | [<img src="https://avatars0.githubusercontent.com/u/7442371?s=200&v=4" width="120">](https://r7.com/) | [<img src="https://raw.githubusercontent.com/Novage/p2p-media-loader/gh-pages/images/p2pml-logo.png" width="120">](https://github.com/Novage/p2p-media-loader) | [<img src="https://avatars3.githubusercontent.com/u/45617200?s=400" width="120">](https://kayosports.com.au) |
| [<img src="https://avatars1.githubusercontent.com/u/5279615?s=400&u=9771a216836c613f1edf4afe71cfc69d4c5657ed&v=4" width="120">](https://flosports.tv) | [<img src="https://www.logolynx.com/images/logolynx/c6/c67a2cb3ad33a82b5518f8ad8f124703.png" width="120">](https://global.axon.com/) | [<img src="https://cms-static.brid.tv/img/brid-logo-120x120.jpg" width="120">](https://www.brid.tv/) | [<img src="https://raw.githubusercontent.com/kaltura/kaltura-player-js/master/docs/images/kaltura-logo.svg" width="120">](https://corp.kaltura.com/) |
| [<img src="https://avatars1.githubusercontent.com/u/5279615?s=400&u=9771a216836c613f1edf4afe71cfc69d4c5657ed&v=4" width="120">](https://flosports.tv) | [<img src="https://www.logolynx.com/images/logolynx/c6/c67a2cb3ad33a82b5518f8ad8f124703.png" width="120">](https://global.axon.com/) | | |
## Chrome/Firefox integration

node_modules/hls.js/dist/hls-demo.js (generated, vendored; 66577 changed lines)
File diff suppressed because it is too large


node_modules/hls.js/dist/hls.js (generated, vendored; 52149 changed lines)
File diff suppressed because it is too large

node_modules/hls.js/dist/hls.js.d.ts (generated, vendored; 605 changed lines)
File diff suppressed because it is too large


node_modules/hls.js/dist/hls.light.js (generated, vendored; 38877 changed lines)
File diff suppressed because it is too large

Diffs for several additional generated dist files suppressed because one or more lines are too long

node_modules/hls.js/package.json (generated, vendored; 134 changed lines)

@ -1 +1,133 @@
{"name":"hls.js","license":"Apache-2.0","description":"JavaScript HLS client using MediaSourceExtension","homepage":"https://github.com/video-dev/hls.js","authors":"Guillaume du Pontavice <g.du.pontavice@gmail.com>","repository":{"type":"git","url":"https://github.com/video-dev/hls.js"},"bugs":{"url":"https://github.com/video-dev/hls.js/issues"},"main":"./dist/hls.js","types":"./dist/hls.js.d.ts","files":["dist/**/*","src/**/*"],"publishConfig":{"access":"public"},"scripts":{"build":"webpack --progress && npm run build:types","build:ci":"webpack && tsc --build tsconfig-lib.json && api-extractor run","build:debug":"webpack --progress --env debug --env demo","build:watch":"webpack --progress --env debug --env demo --watch","build:types":"tsc --build tsconfig-lib.json && api-extractor run --local","dev":"webpack serve --progress --env debug --env demo --port 8000 --static .","docs":"esdoc","lint":"eslint src/ tests/ --ext .js --ext .ts","lint:fix":"npm run lint -- --fix","lint:quiet":"npm run lint -- --quiet","lint:staged":"lint-staged","prettier":"prettier --write .","prettier:verify":"prettier --check .","pretest":"npm run lint","sanity-check":"npm run lint && npm run prettier:verify && npm run type-check && npm run docs && npm run build:types && npm run build && npm run test:unit","start":"npm run dev","test":"npm run test:unit && npm run test:func","test:unit":"karma start karma.conf.js","test:unit:debug":"DEBUG_UNIT_TESTS=1 karma start karma.conf.js --auto-watch --no-single-run --browsers Chrome","test:unit:watch":"karma start karma.conf.js --auto-watch --no-single-run","test:func":"BABEL_ENV=development mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit","test:func:light":"BABEL_ENV=development HLSJS_LIGHT=1 mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit","test:func:sauce":"SAUCE=1 UA=safari OS='OS X 10.15' BABEL_ENV=development mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit","type-check":"tsc --noEmit","type-check:watch":"npm run type-check -- --watch","prepare":"husky 
install"},"devDependencies":{"@babel/core":"7.20.12","@babel/helper-module-imports":"7.18.6","@babel/plugin-proposal-class-properties":"7.18.6","@babel/plugin-proposal-object-rest-spread":"7.20.7","@babel/plugin-proposal-optional-chaining":"7.20.7","@babel/plugin-transform-object-assign":"7.18.6","@babel/preset-env":"7.20.2","@babel/preset-typescript":"7.18.6","@babel/register":"7.18.9","@itsjamie/esdoc-cli":"0.5.0","@itsjamie/esdoc-core":"0.5.0","@itsjamie/esdoc-ecmascript-proposal-plugin":"0.5.0","@itsjamie/esdoc-standard-plugin":"0.5.0","@itsjamie/esdoc-typescript-plugin":"0.5.0","@microsoft/api-extractor":"7.33.7","@types/chai":"4.3.4","@types/chart.js":"2.9.37","@types/mocha":"10.0.1","@types/sinon-chai":"3.2.9","@typescript-eslint/eslint-plugin":"5.42.1","@typescript-eslint/parser":"5.42.1","babel-loader":"9.1.2","babel-plugin-transform-remove-console":"6.9.4","chai":"4.3.7","chart.js":"2.9.4","chromedriver":"108.0.0","coverage-istanbul-loader":"3.0.5","eslint":"8.31.0","eslint-config-prettier":"8.6.0","eslint-plugin-import":"2.26.0","eslint-plugin-mocha":"10.1.0","eslint-plugin-node":"11.1.0","eslint-plugin-promise":"6.1.1","eventemitter3":"4.0.7","http-server":"14.1.1","husky":"8.0.3","jsonpack":"1.1.5","karma":"6.4.1","karma-chrome-launcher":"3.1.1","karma-coverage-istanbul-reporter":"3.0.3","karma-mocha":"2.0.1","karma-mocha-reporter":"2.2.5","karma-sinon-chai":"2.0.2","karma-sourcemap-loader":"0.3.8","karma-webpack":"5.0.0","lint-staged":"13.1.0","micromatch":"4.0.5","mocha":"10.2.0","netlify-cli":"12.2.8","node-fetch":"3.3.0","prettier":"2.8.1","promise-polyfill":"8.2.3","sauce-connect-launcher":"1.3.2","selenium-webdriver":"4.7.1","semver":"7.3.8","sinon":"14.0.2","sinon-chai":"3.7.0","typescript":"4.9.4","url-toolkit":"2.2.5","webpack":"5.75.0","webpack-cli":"5.0.1","webpack-dev-server":"4.11.1","webpack-merge":"5.8.0"},"version":"1.3.3"}
{
"name": "hls.js",
"license": "Apache-2.0",
"description": "JavaScript HLS client using MediaSourceExtension",
"homepage": "https://github.com/video-dev/hls.js",
"authors": "Guillaume du Pontavice <g.du.pontavice@gmail.com>",
"repository": {
"type": "git",
"url": "https://github.com/video-dev/hls.js"
},
"bugs": {
"url": "https://github.com/video-dev/hls.js/issues"
},
"main": "./dist/hls.js",
"module": "./dist/hls.mjs",
"types": "./dist/hls.js.d.ts",
"exports": {
".": {
"types": "./dist/hls.js.d.ts",
"import": "./dist/hls.mjs",
"require": "./dist/hls.js"
},
"./dist/*": "./dist/*",
"./package.json": "./package.json"
},
"files": [
"dist/**/*",
"src/**/*"
],
"publishConfig": {
"access": "public"
},
"scripts": {
"build": "rollup --config && npm run build:types",
"build:ci": "rollup --config && tsc --build tsconfig-lib.json && api-extractor run && es-check",
"build:debug": "rollup --config --configType full --configType demo",
"build:watch": "rollup --config --configType full --configType demo --watch",
"build:types": "tsc --build tsconfig-lib.json && api-extractor run --local",
"dev": "run-p build:watch serve",
"serve": "http-server -o '/demo' .",
"docs": "doctoc ./docs/API.md && api-documenter markdown -i api-extractor -o api-extractor/api-documenter && rm api-extractor/api-documenter/index.md && npm run docs-md-to-html",
"docs-md-to-html": "generate-md --layout github --input api-extractor/api-documenter --output api-docs",
"lint": "eslint src/ tests/ --ext .js --ext .ts",
"lint:fix": "npm run lint -- --fix",
"lint:quiet": "npm run lint -- --quiet",
"lint:staged": "lint-staged",
"prettier": "prettier --write .",
"prettier:verify": "prettier --check .",
"pretest": "npm run lint",
"sanity-check": "npm run lint && npm run prettier:verify && npm run type-check && npm run build && es-check && npm run docs && npm run test:unit",
"start": "npm run dev",
"test": "npm run test:unit && npm run test:func",
"test:unit": "karma start karma.conf.js",
"test:unit:debug": "DEBUG_UNIT_TESTS=1 karma start karma.conf.js --auto-watch --no-single-run --browsers Chrome",
"test:unit:watch": "karma start karma.conf.js --auto-watch --no-single-run",
"test:func": "BABEL_ENV=development mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit",
"test:func:light": "BABEL_ENV=development HLSJS_LIGHT=1 mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit",
"test:func:sauce": "SAUCE=1 UA=safari OS='OS X 10.15' BABEL_ENV=development mocha --require @babel/register tests/functional/auto/setup.js --timeout 40000 --exit",
"type-check": "tsc --noEmit",
"type-check:watch": "npm run type-check -- --watch",
"prepare": "husky install"
},
"devDependencies": {
"@babel/core": "7.21.8",
"@babel/helper-module-imports": "7.21.4",
"@babel/plugin-proposal-class-properties": "7.18.6",
"@babel/plugin-proposal-object-rest-spread": "7.20.7",
"@babel/plugin-proposal-optional-chaining": "7.21.0",
"@babel/plugin-transform-object-assign": "7.18.6",
"@babel/preset-env": "7.21.5",
"@babel/preset-typescript": "7.21.5",
"@babel/register": "7.21.0",
"@microsoft/api-documenter": "7.22.4",
"@microsoft/api-extractor": "7.34.8",
"@rollup/plugin-alias": "5.0.0",
"@rollup/plugin-babel": "6.0.3",
"@rollup/plugin-commonjs": "24.1.0",
"@rollup/plugin-node-resolve": "15.0.2",
"@rollup/plugin-replace": "5.0.2",
"@rollup/plugin-terser": "0.4.1",
"@rollup/plugin-typescript": "11.1.0",
"@types/chai": "4.3.5",
"@types/chart.js": "2.9.37",
"@types/mocha": "10.0.1",
"@types/sinon-chai": "3.2.9",
"@typescript-eslint/eslint-plugin": "5.59.2",
"@typescript-eslint/parser": "5.59.2",
"babel-loader": "9.1.2",
"babel-plugin-transform-remove-console": "6.9.4",
"chai": "4.3.7",
"chart.js": "2.9.4",
"chromedriver": "112.0.1",
"doctoc": "2.2.1",
"es-check": "7.1.1",
"eslint": "8.39.0",
"eslint-config-prettier": "8.8.0",
"eslint-plugin-import": "2.27.5",
"eslint-plugin-mocha": "10.1.0",
"eslint-plugin-node": "11.1.0",
"eslint-plugin-promise": "6.1.1",
"eventemitter3": "5.0.1",
"http-server": "14.1.1",
"husky": "8.0.3",
"jsonpack": "1.1.5",
"karma": "6.4.2",
"karma-chrome-launcher": "3.2.0",
"karma-coverage": "2.2.0",
"karma-mocha": "2.0.1",
"karma-mocha-reporter": "2.2.5",
"karma-rollup-preprocessor": "7.0.8",
"karma-sinon-chai": "2.0.2",
"karma-sourcemap-loader": "0.4.0",
"lint-staged": "13.2.2",
"markdown-styles": "3.2.0",
"micromatch": "4.0.5",
"mocha": "10.2.0",
"node-fetch": "3.3.1",
"npm-run-all": "4.1.5",
"prettier": "2.8.8",
"promise-polyfill": "8.3.0",
"rollup": "3.21.5",
"rollup-plugin-istanbul": "4.0.0",
"sauce-connect-launcher": "1.3.2",
"selenium-webdriver": "4.9.0",
"semver": "7.5.0",
"sinon": "15.0.4",
"sinon-chai": "3.7.0",
"typescript": "5.0.4",
"url-toolkit": "2.2.5",
"wrangler": "2.20.0"
},
"version": "1.4.5"
}
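The `exports` map added above determines which bundle a consumer receives: the `import` condition resolves to the new ESM build (`./dist/hls.mjs`), `require` to the CJS/UMD build (`./dist/hls.js`), and `./dist/*` keeps deep imports working. A sketch of the resulting entry points, assuming a bundler or Node version that honors `exports` conditions:

```ts
// ESM consumers: resolved to ./dist/hls.mjs via the "import" condition
import Hls from 'hls.js';

// Deep imports into dist/ remain allowed through the "./dist/*" entry
// (the light build omits alt-audio, subtitles, CMCD, EME and variable substitution)
import HlsLight from 'hls.js/dist/hls.light.js';

// CommonJS consumers resolve ./dist/hls.js via the "require" condition:
//   const Hls = require('hls.js');
```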

node_modules/hls.js/src/config.ts (generated, vendored; 281 changed lines)

@ -11,6 +11,8 @@ import EMEController, {
MediaKeySessionContext,
} from './controller/eme-controller';
import CMCDController from './controller/cmcd-controller';
import ContentSteeringController from './controller/content-steering-controller';
import ErrorController from './controller/error-controller';
import XhrLoader from './utils/xhr-loader';
import FetchLoader, { fetchSupported } from './utils/fetch-loader';
import Cues from './utils/cues';
@ -32,6 +34,9 @@ export type ABRControllerConfig = {
abrEwmaSlowLive: number;
abrEwmaFastVoD: number;
abrEwmaSlowVoD: number;
/**
* Default bandwidth estimate in bits/s prior to collecting fragment bandwidth samples
*/
abrEwmaDefaultEstimate: number;
abrBandWidthFactor: number;
abrBandWidthUpFactor: number;
@ -44,6 +49,9 @@ export type BufferControllerConfig = {
appendErrorMaxRetry: number;
backBufferLength: number;
liveDurationInfinity: boolean;
/**
* @deprecated use backBufferLength
*/
liveBackBufferLength: number | null;
};
@ -111,9 +119,10 @@ export interface FragmentLoaderConstructor {
new (confg: HlsConfig): Loader<FragmentLoaderContext>;
}
/**
* @deprecated use fragLoadPolicy.default
*/
export type FragmentLoaderConfig = {
fLoader?: FragmentLoaderConstructor;
fragLoadingTimeOut: number;
fragLoadingMaxRetry: number;
fragLoadingRetryDelay: number;
@ -139,9 +148,10 @@ export interface PlaylistLoaderConstructor {
new (confg: HlsConfig): Loader<PlaylistLoaderContext>;
}
/**
* @deprecated use manifestLoadPolicy.default and playlistLoadPolicy.default
*/
export type PlaylistLoaderConfig = {
pLoader?: PlaylistLoaderConstructor;
manifestLoadingTimeOut: number;
manifestLoadingMaxRetry: number;
manifestLoadingRetryDelay: number;
@ -153,6 +163,33 @@ export type PlaylistLoaderConfig = {
levelLoadingMaxRetryTimeout: number;
};
export type HlsLoadPolicies = {
fragLoadPolicy: LoadPolicy;
keyLoadPolicy: LoadPolicy;
certLoadPolicy: LoadPolicy;
playlistLoadPolicy: LoadPolicy;
manifestLoadPolicy: LoadPolicy;
steeringManifestLoadPolicy: LoadPolicy;
};
export type LoadPolicy = {
default: LoaderConfig;
};
export type LoaderConfig = {
maxTimeToFirstByteMs: number; // Max time to first byte
maxLoadTimeMs: number; // Max time for load completion
timeoutRetry: RetryConfig | null;
errorRetry: RetryConfig | null;
};
export type RetryConfig = {
maxNumRetry: number; // Maximum number of retries
retryDelayMs: number; // Retry delay = 2^retryCount * retryDelayMs (exponential) or retryCount * retryDelayMs (linear)
maxRetryDelayMs: number; // Maximum delay between retries
backoff?: 'exponential' | 'linear'; // used to determine retry backoff duration (see retryDelayMs)
};
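As a rough sketch of how these policy types can be used when constructing the player (the field names follow the `LoadPolicy`/`LoaderConfig`/`RetryConfig` definitions above; the numbers are arbitrary, not defaults):

```ts
import Hls from 'hls.js';

// Hypothetical override: allow fragments a 9 s time-to-first-byte and up to
// four retries on error, backing off exponentially from 1 s to an 8 s cap.
const hls = new Hls({
  fragLoadPolicy: {
    default: {
      maxTimeToFirstByteMs: 9000,
      maxLoadTimeMs: 100000,
      timeoutRetry: { maxNumRetry: 2, retryDelayMs: 0, maxRetryDelayMs: 0 },
      errorRetry: {
        maxNumRetry: 4,
        retryDelayMs: 1000,
        maxRetryDelayMs: 8000,
        backoff: 'exponential',
      },
    },
  },
});
```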
export type StreamControllerConfig = {
autoStartLoad: boolean;
startPosition: number;
@ -207,12 +244,15 @@ export type TSDemuxerConfig = {
export type HlsConfig = {
debug: boolean | ILogger;
enableWorker: boolean;
workerPath: null | string;
enableSoftwareAES: boolean;
minAutoBitrate: number;
ignoreDevicePixelRatio: boolean;
loader: { new (confg: HlsConfig): Loader<LoaderContext> };
fLoader?: FragmentLoaderConstructor;
pLoader?: PlaylistLoaderConstructor;
fetchSetup?: (context: LoaderContext, initParams: any) => Request;
xhrSetup?: (xhr: XMLHttpRequest, url: string) => void;
xhrSetup?: (xhr: XMLHttpRequest, url: string) => Promise<void> | void;
// Alt Audio
audioStreamController?: typeof AudioStreamController;
@ -226,10 +266,13 @@ export type HlsConfig = {
// CMCD
cmcd?: CMCDControllerConfig;
cmcdController?: typeof CMCDController;
// Content Steering
contentSteeringController?: typeof ContentSteeringController;
abrController: typeof AbrController;
bufferController: typeof BufferController;
capLevelController: typeof CapLevelController;
errorController: typeof ErrorController;
fpsController: typeof FPSController;
progressive: boolean;
lowLatencyMode: boolean;
@ -238,19 +281,30 @@ export type HlsConfig = {
CapLevelControllerConfig &
EMEControllerConfig &
FPSControllerConfig &
FragmentLoaderConfig &
LevelControllerConfig &
MP4RemuxerConfig &
PlaylistLoaderConfig &
StreamControllerConfig &
LatencyControllerConfig &
MetadataControllerConfig &
TimelineControllerConfig &
TSDemuxerConfig;
TSDemuxerConfig &
HlsLoadPolicies &
FragmentLoaderConfig &
PlaylistLoaderConfig;
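One user-visible change in the config type above is that `xhrSetup` may now return a `Promise<void>`, so per-request setup can be asynchronous. A hedged sketch, where `fetchAuthToken` is a hypothetical application-side helper:

```ts
import Hls from 'hls.js';

declare function fetchAuthToken(): Promise<string>; // hypothetical application helper

const hls = new Hls({
  xhrSetup: async (xhr: XMLHttpRequest, url: string) => {
    // The returned promise is awaited before the request is sent
    const token = await fetchAuthToken();
    // Open the request before setting headers; the loader only opens the
    // request itself when it has not been opened already
    xhr.open('GET', url, true);
    xhr.setRequestHeader('Authorization', `Bearer ${token}`);
  },
});
```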
// If possible, keep hlsDefaultConfig shallow
// It is cloned whenever a new Hls instance is created, by keeping the config
// shallow the properties are cloned, and we don't end up manipulating the default
const defaultLoadPolicy: LoaderConfig = {
maxTimeToFirstByteMs: 8000,
maxLoadTimeMs: 20000,
timeoutRetry: null,
errorRetry: null,
};
/**
* @ignore
* If possible, keep hlsDefaultConfig shallow
* It is cloned whenever a new Hls instance is created, by keeping the config
* shallow the properties are cloned, and we don't end up manipulating the default
*/
export const hlsDefaultConfig: HlsConfig = {
autoStartLoad: true, // used by stream-controller
startPosition: -1, // used by stream-controller
@ -274,23 +328,15 @@ export const hlsDefaultConfig: HlsConfig = {
liveMaxLatencyDuration: undefined, // used by latency-controller
maxLiveSyncPlaybackRate: 1, // used by latency-controller
liveDurationInfinity: false, // used by buffer-controller
/**
* @deprecated use backBufferLength
*/
liveBackBufferLength: null, // used by buffer-controller
maxMaxBufferLength: 600, // used by stream-controller
enableWorker: true, // used by demuxer
enableWorker: true, // used by transmuxer
workerPath: null, // used by transmuxer
enableSoftwareAES: true, // used by decrypter
manifestLoadingTimeOut: 10000, // used by playlist-loader
manifestLoadingMaxRetry: 1, // used by playlist-loader
manifestLoadingRetryDelay: 1000, // used by playlist-loader
manifestLoadingMaxRetryTimeout: 64000, // used by playlist-loader
startLevel: undefined, // used by level-controller
levelLoadingTimeOut: 10000, // used by playlist-loader
levelLoadingMaxRetry: 4, // used by playlist-loader
levelLoadingRetryDelay: 1000, // used by playlist-loader
levelLoadingMaxRetryTimeout: 64000, // used by playlist-loader
fragLoadingTimeOut: 20000, // used by fragment-loader
fragLoadingMaxRetry: 6, // used by fragment-loader
fragLoadingRetryDelay: 1000, // used by fragment-loader
fragLoadingMaxRetryTimeout: 64000, // used by fragment-loader
startFragPrefetch: false, // used by stream-controller
fpsDroppedMonitoringPeriod: 5000, // used by fps-controller
fpsDroppedMonitoringThreshold: 0.2, // used by fps-controller
@ -305,6 +351,7 @@ export const hlsDefaultConfig: HlsConfig = {
abrController: AbrController,
bufferController: BufferController,
capLevelController: CapLevelController,
errorController: ErrorController,
fpsController: FPSController,
stretchShortVideoTrack: false, // used by mp4-remuxer
maxAudioFramesDrift: 1, // used by mp4-remuxer
@ -324,7 +371,9 @@ export const hlsDefaultConfig: HlsConfig = {
widevineLicenseUrl: undefined, // used by eme-controller
drmSystems: {}, // used by eme-controller
drmSystemOptions: {}, // used by eme-controller
requestMediaKeySystemAccessFunc: requestMediaKeySystemAccess, // used by eme-controller
requestMediaKeySystemAccessFunc: __USE_EME_DRM__
? requestMediaKeySystemAccess
: null, // used by eme-controller
testBandwidth: true,
progressive: false,
lowLatencyMode: true,
@ -333,6 +382,109 @@ export const hlsDefaultConfig: HlsConfig = {
enableEmsgMetadataCues: true,
enableID3MetadataCues: true,
certLoadPolicy: {
default: defaultLoadPolicy,
},
keyLoadPolicy: {
default: {
maxTimeToFirstByteMs: 8000,
maxLoadTimeMs: 20000,
timeoutRetry: {
maxNumRetry: 1,
retryDelayMs: 1000,
maxRetryDelayMs: 20000,
backoff: 'linear',
},
errorRetry: {
maxNumRetry: 8,
retryDelayMs: 1000,
maxRetryDelayMs: 20000,
backoff: 'linear',
},
},
},
manifestLoadPolicy: {
default: {
maxTimeToFirstByteMs: Infinity,
maxLoadTimeMs: 20000,
timeoutRetry: {
maxNumRetry: 2,
retryDelayMs: 0,
maxRetryDelayMs: 0,
},
errorRetry: {
maxNumRetry: 1,
retryDelayMs: 1000,
maxRetryDelayMs: 8000,
},
},
},
playlistLoadPolicy: {
default: {
maxTimeToFirstByteMs: 10000,
maxLoadTimeMs: 20000,
timeoutRetry: {
maxNumRetry: 2,
retryDelayMs: 0,
maxRetryDelayMs: 0,
},
errorRetry: {
maxNumRetry: 2,
retryDelayMs: 1000,
maxRetryDelayMs: 8000,
},
},
},
fragLoadPolicy: {
default: {
maxTimeToFirstByteMs: 10000,
maxLoadTimeMs: 120000,
timeoutRetry: {
maxNumRetry: 4,
retryDelayMs: 0,
maxRetryDelayMs: 0,
},
errorRetry: {
maxNumRetry: 6,
retryDelayMs: 1000,
maxRetryDelayMs: 8000,
},
},
},
steeringManifestLoadPolicy: {
default: __USE_CONTENT_STEERING__
? {
maxTimeToFirstByteMs: 10000,
maxLoadTimeMs: 20000,
timeoutRetry: {
maxNumRetry: 2,
retryDelayMs: 0,
maxRetryDelayMs: 0,
},
errorRetry: {
maxNumRetry: 1,
retryDelayMs: 1000,
maxRetryDelayMs: 8000,
},
}
: defaultLoadPolicy,
},
// These default settings are deprecated in favor of the above policies
// and are maintained for backwards compatibility
manifestLoadingTimeOut: 10000,
manifestLoadingMaxRetry: 1,
manifestLoadingRetryDelay: 1000,
manifestLoadingMaxRetryTimeout: 64000,
levelLoadingTimeOut: 10000,
levelLoadingMaxRetry: 4,
levelLoadingRetryDelay: 1000,
levelLoadingMaxRetryTimeout: 64000,
fragLoadingTimeOut: 20000,
fragLoadingMaxRetry: 6,
fragLoadingRetryDelay: 1000,
fragLoadingMaxRetryTimeout: 64000,
// Dynamic Modules
...timelineConfig(),
subtitleStreamController: __USE_SUBTITLES__
@ -346,6 +498,9 @@ export const hlsDefaultConfig: HlsConfig = {
audioTrackController: __USE_ALT_AUDIO__ ? AudioTrackController : undefined,
emeController: __USE_EME_DRM__ ? EMEController : undefined,
cmcdController: __USE_CMCD__ ? CMCDController : undefined,
contentSteeringController: __USE_CONTENT_STEERING__
? ContentSteeringController
: undefined,
};
function timelineConfig(): TimelineControllerConfig {
@ -366,6 +521,9 @@ function timelineConfig(): TimelineControllerConfig {
};
}
/**
* @ignore
*/
export function mergeConfig(
defaultConfig: HlsConfig,
userConfig: Partial<HlsConfig>
@ -401,9 +559,80 @@ export function mergeConfig(
);
}
return Object.assign({}, defaultConfig, userConfig);
const defaultsCopy = deepCpy(defaultConfig);
// Backwards compatibility with deprecated config values
const deprecatedSettingTypes = ['manifest', 'level', 'frag'];
const deprecatedSettings = [
'TimeOut',
'MaxRetry',
'RetryDelay',
'MaxRetryTimeout',
];
deprecatedSettingTypes.forEach((type) => {
const policyName = `${type === 'level' ? 'playlist' : type}LoadPolicy`;
const policyNotSet = userConfig[policyName] === undefined;
const report: string[] = [];
deprecatedSettings.forEach((setting) => {
const deprecatedSetting = `${type}Loading${setting}`;
const value = userConfig[deprecatedSetting];
if (value !== undefined && policyNotSet) {
report.push(deprecatedSetting);
const settings: LoaderConfig = defaultsCopy[policyName].default;
userConfig[policyName] = { default: settings };
switch (setting) {
case 'TimeOut':
settings.maxLoadTimeMs = value;
settings.maxTimeToFirstByteMs = value;
break;
case 'MaxRetry':
settings.errorRetry!.maxNumRetry = value;
settings.timeoutRetry!.maxNumRetry = value;
break;
case 'RetryDelay':
settings.errorRetry!.retryDelayMs = value;
settings.timeoutRetry!.retryDelayMs = value;
break;
case 'MaxRetryTimeout':
settings.errorRetry!.maxRetryDelayMs = value;
settings.timeoutRetry!.maxRetryDelayMs = value;
break;
}
}
});
if (report.length) {
logger.warn(
`hls.js config: "${report.join(
'", "'
)}" setting(s) are deprecated, use "${policyName}": ${JSON.stringify(
userConfig[policyName]
)}`
);
}
});
return {
...defaultsCopy,
...userConfig,
};
}
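To make the backwards-compatibility branch above concrete: passing one of the deprecated flat settings without the corresponding policy is folded into the matching load policy, and a deprecation warning is logged. A sketch, assuming a caller inside this package using the module's exports:

```ts
import { hlsDefaultConfig, mergeConfig } from './config';

// Deprecated flat setting, no fragLoadPolicy supplied by the user
const merged = mergeConfig(hlsDefaultConfig, { fragLoadingMaxRetry: 3 });

// "fragLoadingMaxRetry" is mapped onto both retry configs of fragLoadPolicy:
//   merged.fragLoadPolicy.default.errorRetry.maxNumRetry === 3
//   merged.fragLoadPolicy.default.timeoutRetry.maxNumRetry === 3
```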
function deepCpy(obj: any): any {
if (obj && typeof obj === 'object') {
if (Array.isArray(obj)) {
return obj.map(deepCpy);
}
return Object.keys(obj).reduce((result, key) => {
result[key] = deepCpy(obj[key]);
return result;
}, {});
}
return obj;
}
/**
* @ignore
*/
export function enableStreamingMode(config) {
const currentLoader = config.loader;
if (currentLoader !== FetchLoader && currentLoader !== XhrLoader) {

node_modules/hls.js/src/controller/abr-controller.ts (generated, vendored)

@ -1,6 +1,5 @@
import EwmaBandWidthEstimator from '../utils/ewma-bandwidth-estimator';
import { Events } from '../events';
import { ErrorDetails, ErrorTypes } from '../errors';
import { PlaylistLevelType } from '../types/loader';
import { logger } from '../utils/logger';
import type { Fragment } from '../loader/fragment';
@ -11,16 +10,17 @@ import type {
FragLoadingData,
FragLoadedData,
FragBufferedData,
ErrorData,
LevelLoadedData,
LevelSwitchingData,
} from '../types/events';
import type { ComponentAPI } from '../types/component-api';
import type { AbrComponentAPI } from '../types/component-api';
class AbrController implements ComponentAPI {
class AbrController implements AbrComponentAPI {
protected hls: Hls;
private lastLevelLoadSec: number = 0;
private lastLoadedFragLevel: number = 0;
private _nextAutoLevel: number = -1;
private timer?: number;
private timer: number = -1;
private onCheck: Function = this._abandonRulesCheck.bind(this);
private fragCurrent: Fragment | null = null;
private partCurrent: Part | null = null;
@ -46,8 +46,8 @@ class AbrController implements ComponentAPI {
hls.on(Events.FRAG_LOADING, this.onFragLoading, this);
hls.on(Events.FRAG_LOADED, this.onFragLoaded, this);
hls.on(Events.FRAG_BUFFERED, this.onFragBuffered, this);
hls.on(Events.LEVEL_SWITCHING, this.onLevelSwitching, this);
hls.on(Events.LEVEL_LOADED, this.onLevelLoaded, this);
hls.on(Events.ERROR, this.onError, this);
}
protected unregisterListeners() {
@ -55,8 +55,8 @@ class AbrController implements ComponentAPI {
hls.off(Events.FRAG_LOADING, this.onFragLoading, this);
hls.off(Events.FRAG_LOADED, this.onFragLoaded, this);
hls.off(Events.FRAG_BUFFERED, this.onFragBuffered, this);
hls.off(Events.LEVEL_SWITCHING, this.onLevelSwitching, this);
hls.off(Events.LEVEL_LOADED, this.onLevelLoaded, this);
hls.off(Events.ERROR, this.onError, this);
}
public destroy() {
@ -69,17 +69,40 @@ class AbrController implements ComponentAPI {
protected onFragLoading(event: Events.FRAG_LOADING, data: FragLoadingData) {
const frag = data.frag;
if (frag.type === PlaylistLevelType.MAIN) {
if (!this.timer) {
this.fragCurrent = frag;
this.partCurrent = data.part ?? null;
this.timer = self.setInterval(this.onCheck, 100);
}
if (this.ignoreFragment(frag)) {
return;
}
this.fragCurrent = frag;
this.partCurrent = data.part ?? null;
this.clearTimer();
this.timer = self.setInterval(this.onCheck, 100);
}
protected onLevelSwitching(
event: Events.LEVEL_SWITCHING,
data: LevelSwitchingData
): void {
this.clearTimer();
}
private getTimeToLoadFrag(
timeToFirstByteSec: number,
bandwidth: number,
fragSizeBits: number,
isSwitch: boolean
) {
const fragLoadSec = timeToFirstByteSec + fragSizeBits / bandwidth;
const playlistLoadSec = isSwitch ? this.lastLevelLoadSec : 0;
return fragLoadSec + playlistLoadSec;
}
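As a rough worked example of the estimate above (illustrative numbers, not taken from the source): with a 0.2 s TTFB estimate, a 5 Mbit/s bandwidth estimate and a 4 Mbit fragment, loading the fragment costs 0.2 + 4/5 = 1.0 s, and a level switch adds the last playlist load time on top:

```ts
// Illustrative values, mirroring the body of getTimeToLoadFrag()
const timeToFirstByteSec = 0.2;
const bandwidthBitsPerSec = 5_000_000;
const fragSizeBits = 4_000_000;
const lastLevelLoadSec = 0.3; // playlist reload cost counted only when switching levels

const sameLevelSec = timeToFirstByteSec + fragSizeBits / bandwidthBitsPerSec; // 1.0 s
const levelSwitchSec = sameLevelSec + lastLevelLoadSec; // 1.3 s
```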
protected onLevelLoaded(event: Events.LEVEL_LOADED, data: LevelLoadedData) {
const config = this.hls.config;
const { total, bwEstimate } = data.stats;
// Total is the bytelength and bwEstimate in bits/sec
if (Number.isFinite(total) && Number.isFinite(bwEstimate)) {
this.lastLevelLoadSec = (8 * total) / bwEstimate;
}
if (data.details.live) {
this.bwEstimator.update(config.abrEwmaSlowLive, config.abrEwmaFastLive);
} else {
@ -98,8 +121,10 @@ class AbrController implements ComponentAPI {
return;
}
const now = performance.now();
const stats: LoaderStats = part ? part.stats : frag.stats;
const duration = part ? part.duration : frag.duration;
const timeLoading = now - stats.loading.start;
// If frag loading is aborted, complete, or from lowest level, stop timer and return
if (
stats.aborted ||
@ -127,35 +152,50 @@ class AbrController implements ComponentAPI {
return;
}
const requestDelay = performance.now() - stats.loading.start;
const ttfbEstimate = this.bwEstimator.getEstimateTTFB();
const playbackRate = Math.abs(media.playbackRate);
// In order to work with a stable bandwidth, only begin monitoring bandwidth after half of the fragment has been loaded
if (requestDelay <= (500 * duration) / playbackRate) {
// To maintain stable adaptive playback, only begin monitoring frag loading after half or more of its playback duration has passed
if (
timeLoading <=
Math.max(ttfbEstimate, 1000 * (duration / (playbackRate * 2)))
) {
return;
}
const loadedFirstByte = stats.loaded && stats.loading.first;
// bufferStarvationDelay is an estimate of the amount time (in seconds) it will take to exhaust the buffer
const bufferStarvationDelay = bufferInfo.len / playbackRate;
// Only downswitch if less than 2 fragment lengths are buffered
if (bufferStarvationDelay >= (2 * duration) / playbackRate) {
return;
}
const ttfb = stats.loading.first
? stats.loading.first - stats.loading.start
: -1;
const loadedFirstByte = stats.loaded && ttfb > -1;
const bwEstimate: number = this.bwEstimator.getEstimate();
const { levels, minAutoLevel } = hls;
const level = levels[frag.level];
const expectedLen =
stats.total ||
Math.max(stats.loaded, Math.round((duration * level.maxBitrate) / 8));
const loadRate = loadedFirstByte ? (stats.loaded * 1000) / requestDelay : 0;
let timeStreaming = timeLoading - ttfb;
if (timeStreaming < 1 && loadedFirstByte) {
timeStreaming = Math.min(timeLoading, (stats.loaded * 8) / bwEstimate);
}
const loadRate = loadedFirstByte
? (stats.loaded * 1000) / timeStreaming
: 0;
// fragLoadDelay is an estimate of the time (in seconds) it will take to buffer the remainder of the fragment
const fragLoadedDelay = loadRate
? (expectedLen - stats.loaded) / loadRate
: (expectedLen * 8) / bwEstimate;
// bufferStarvationDelay is an estimate of the amount time (in seconds) it will take to exhaust the buffer
const bufferStarvationDelay = bufferInfo.len / playbackRate;
: (expectedLen * 8) / bwEstimate + ttfbEstimate / 1000;
// Only downswitch if the time to finish loading the current fragment is greater than the amount of buffer left
if (fragLoadedDelay <= bufferStarvationDelay) {
return;
}
const bwe = loadRate ? loadRate * 8 : bwEstimate;
let fragLevelNextLoadedDelay: number = Number.POSITIVE_INFINITY;
let nextLoadLevel: number;
// Iterate through lower level and try to find the largest one that avoids rebuffering
@ -165,13 +205,14 @@ class AbrController implements ComponentAPI {
nextLoadLevel--
) {
// compute time to load next fragment at lower level
// 0.8 : consider only 80% of current bw to be conservative
// 8 = bits per byte (bps/Bps)
const levelNextBitrate = levels[nextLoadLevel].maxBitrate;
fragLevelNextLoadedDelay = loadRate
? (duration * levelNextBitrate) / (8 * 0.8 * loadRate)
: (duration * levelNextBitrate) / bwEstimate;
fragLevelNextLoadedDelay = this.getTimeToLoadFrag(
ttfbEstimate / 1000,
bwe,
duration * levelNextBitrate,
!levels[nextLoadLevel].details
);
if (fragLevelNextLoadedDelay < bufferStarvationDelay) {
break;
}
@ -181,26 +222,41 @@ class AbrController implements ComponentAPI {
if (fragLevelNextLoadedDelay >= fragLoadedDelay) {
return;
}
logger.warn(`Fragment ${frag.sn}${
// if estimated load time of new segment is completely unreasonable, ignore and do not emergency switch down
if (fragLevelNextLoadedDelay > duration * 10) {
return;
}
hls.nextLoadLevel = nextLoadLevel;
if (loadedFirstByte) {
// If there has been loading progress, sample bandwidth using loading time offset by minimum TTFB time
this.bwEstimator.sample(
timeLoading - Math.min(ttfbEstimate, ttfb),
stats.loaded
);
} else {
// If there has been no loading progress, sample TTFB
this.bwEstimator.sampleTTFB(timeLoading);
}
this.clearTimer();
logger.warn(`[abr] Fragment ${frag.sn}${
part ? ' part ' + part.index : ''
} of level ${
frag.level
} is loading too slowly and will cause an underbuffer; aborting and switching to level ${nextLoadLevel}
} of level ${frag.level} is loading too slowly;
Time to underbuffer: ${bufferStarvationDelay.toFixed(3)} s
Estimated load time for current fragment: ${fragLoadedDelay.toFixed(3)} s
Estimated load time for down switch fragment: ${fragLevelNextLoadedDelay.toFixed(
3
)} s
TTFB estimate: ${ttfb}
Current BW estimate: ${
Number.isFinite(bwEstimate) ? (bwEstimate / 1024).toFixed(3) : 'Unknown'
} Kb/s
Estimated load time for current fragment: ${fragLoadedDelay.toFixed(3)} s
Estimated load time for the next fragment: ${fragLevelNextLoadedDelay.toFixed(
New BW estimate: ${(this.bwEstimator.getEstimate() / 1024).toFixed(
3
)} s
Time to underbuffer: ${bufferStarvationDelay.toFixed(3)} s`);
hls.nextLoadLevel = nextLoadLevel;
if (loadedFirstByte) {
// If there has been loading progress, sample bandwidth
this.bwEstimator.sample(requestDelay, stats.loaded);
}
this.clearTimer();
if (frag.loader || frag.keyLoader) {
)} Kb/s
Aborting and switching to level ${nextLoadLevel}`);
if (frag.loader) {
this.fragCurrent = this.partCurrent = null;
frag.abortRequests();
}
@ -211,38 +267,40 @@ class AbrController implements ComponentAPI {
event: Events.FRAG_LOADED,
{ frag, part }: FragLoadedData
) {
if (
frag.type === PlaylistLevelType.MAIN &&
Number.isFinite(frag.sn as number)
) {
const stats = part ? part.stats : frag.stats;
const duration = part ? part.duration : frag.duration;
// stop monitoring bw once frag loaded
this.clearTimer();
// store level id after successful fragment load
this.lastLoadedFragLevel = frag.level;
// reset forced auto level value so that next level will be selected
this._nextAutoLevel = -1;
const stats = part ? part.stats : frag.stats;
if (frag.type === PlaylistLevelType.MAIN) {
this.bwEstimator.sampleTTFB(stats.loading.first - stats.loading.start);
}
if (this.ignoreFragment(frag)) {
return;
}
// stop monitoring bw once frag loaded
this.clearTimer();
// store level id after successful fragment load
this.lastLoadedFragLevel = frag.level;
// reset forced auto level value so that next level will be selected
this._nextAutoLevel = -1;
// compute level average bitrate
if (this.hls.config.abrMaxWithRealBitrate) {
const level = this.hls.levels[frag.level];
const loadedBytes =
(level.loaded ? level.loaded.bytes : 0) + stats.loaded;
const loadedDuration =
(level.loaded ? level.loaded.duration : 0) + duration;
level.loaded = { bytes: loadedBytes, duration: loadedDuration };
level.realBitrate = Math.round((8 * loadedBytes) / loadedDuration);
}
if (frag.bitrateTest) {
const fragBufferedData: FragBufferedData = {
stats,
frag,
part,
id: frag.type,
};
this.onFragBuffered(Events.FRAG_BUFFERED, fragBufferedData);
}
// compute level average bitrate
if (this.hls.config.abrMaxWithRealBitrate) {
const duration = part ? part.duration : frag.duration;
const level = this.hls.levels[frag.level];
const loadedBytes =
(level.loaded ? level.loaded.bytes : 0) + stats.loaded;
const loadedDuration =
(level.loaded ? level.loaded.duration : 0) + duration;
level.loaded = { bytes: loadedBytes, duration: loadedDuration };
level.realBitrate = Math.round((8 * loadedBytes) / loadedDuration);
}
if (frag.bitrateTest) {
const fragBufferedData: FragBufferedData = {
stats,
frag,
part,
id: frag.type,
};
this.onFragBuffered(Events.FRAG_BUFFERED, fragBufferedData);
frag.bitrateTest = false;
}
}
@ -251,19 +309,24 @@ class AbrController implements ComponentAPI {
data: FragBufferedData
) {
const { frag, part } = data;
const stats = part ? part.stats : frag.stats;
const stats = part?.stats.loaded ? part.stats : frag.stats;
if (stats.aborted) {
return;
}
// Only count non-alt-audio frags which were actually buffered in our BW calculations
if (frag.type !== PlaylistLevelType.MAIN || frag.sn === 'initSegment') {
if (this.ignoreFragment(frag)) {
return;
}
// Use the difference between parsing and request instead of buffering and request to compute fragLoadingProcessing;
// rationale is that buffer appending only happens once media is attached. This can happen when config.startFragPrefetch
// is used. If we used buffering in that case, our BW estimate sample will be very large.
const processingMs = stats.parsing.end - stats.loading.start;
const processingMs =
stats.parsing.end -
stats.loading.start -
Math.min(
stats.loading.first - stats.loading.start,
this.bwEstimator.getEstimateTTFB()
);
this.bwEstimator.sample(processingMs, stats.loaded);
stats.bwEstimate = this.bwEstimator.getEstimate();
if (frag.bitrateTest) {
@ -273,29 +336,13 @@ class AbrController implements ComponentAPI {
}
}
protected onError(event: Events.ERROR, data: ErrorData) {
// stop timer in case of frag loading error
if (data.frag?.type === PlaylistLevelType.MAIN) {
if (data.type === ErrorTypes.KEY_SYSTEM_ERROR) {
this.clearTimer();
return;
}
switch (data.details) {
case ErrorDetails.FRAG_LOAD_ERROR:
case ErrorDetails.FRAG_LOAD_TIMEOUT:
case ErrorDetails.KEY_LOAD_ERROR:
case ErrorDetails.KEY_LOAD_TIMEOUT:
this.clearTimer();
break;
default:
break;
}
}
private ignoreFragment(frag: Fragment): boolean {
// Only count non-alt-audio frags which were actually buffered in our BW calculations
return frag.type !== PlaylistLevelType.MAIN || frag.sn === 'initSegment';
}
clearTimer() {
public clearTimer() {
self.clearInterval(this.timer);
this.timer = undefined;
}
// return next auto level
@ -310,8 +357,14 @@ class AbrController implements ComponentAPI {
// compute next level using ABR logic
let nextABRAutoLevel = this.getNextABRAutoLevel();
// use forced auto level when ABR selected level has errored
if (forcedAutoLevel !== -1 && this.hls.levels[nextABRAutoLevel].loadError) {
return forcedAutoLevel;
if (forcedAutoLevel !== -1) {
const levels = this.hls.levels;
if (
levels.length > Math.max(forcedAutoLevel, nextABRAutoLevel) &&
levels[forcedAutoLevel].loadError <= levels[nextABRAutoLevel].loadError
) {
return forcedAutoLevel;
}
}
// if forced auto level has been defined, use it to cap ABR computed quality level
if (forcedAutoLevel !== -1) {
@ -321,7 +374,7 @@ class AbrController implements ComponentAPI {
return nextABRAutoLevel;
}
private getNextABRAutoLevel() {
private getNextABRAutoLevel(): number {
const { fragCurrent, partCurrent, hls } = this;
const { maxAutoLevel, config, minAutoLevel, media } = hls;
const currentFragDuration = partCurrent
@ -355,7 +408,7 @@ class AbrController implements ComponentAPI {
return bestLevel;
}
logger.trace(
`${
`[abr] ${
bufferStarvationDelay ? 'rebuffering expected' : 'buffer is empty'
}, finding optimal quality level`
);
@ -381,7 +434,7 @@ class AbrController implements ComponentAPI {
: config.maxLoadingDelay;
maxStarvationDelay = maxLoadingDelay - bitrateTestDelay;
logger.trace(
`bitrate test took ${Math.round(
`[abr] bitrate test took ${Math.round(
1000 * bitrateTestDelay
)}ms, set first fragment max fetchDuration to ${Math.round(
1000 * maxStarvationDelay
@ -425,6 +478,10 @@ class AbrController implements ComponentAPI {
: fragCurrent
? fragCurrent.duration
: 0;
const ttfbEstimateSec = this.bwEstimator.getEstimateTTFB() / 1000;
let levelSkippedMin = minAutoLevel;
let levelSkippedMax = -1;
for (let i = maxAutoLevel; i >= minAutoLevel; i--) {
const levelInfo = levels[i];
@ -432,8 +489,17 @@ class AbrController implements ComponentAPI {
!levelInfo ||
(currentCodecSet && levelInfo.codecSet !== currentCodecSet)
) {
if (levelInfo) {
levelSkippedMin = Math.min(i, levelSkippedMin);
levelSkippedMax = Math.max(i, levelSkippedMax);
}
continue;
}
if (levelSkippedMax !== -1) {
logger.trace(
`[abr] Skipped level(s) ${levelSkippedMin}-${levelSkippedMax} with CODECS:"${levels[levelSkippedMax].attrs.CODECS}"; not compatible with "${level.attrs.CODECS}"`
);
}
const levelDetails = levelInfo.details;
const avgDuration =
@ -455,12 +521,21 @@ class AbrController implements ComponentAPI {
}
const bitrate: number = levels[i].maxBitrate;
const fetchDuration: number = (bitrate * avgDuration) / adjustedbw;
const fetchDuration: number = this.getTimeToLoadFrag(
ttfbEstimateSec,
adjustedbw,
bitrate * avgDuration,
levelDetails === undefined
);
logger.trace(
`level/adjustedbw/bitrate/avgDuration/maxFetchDuration/fetchDuration: ${i}/${Math.round(
adjustedbw
)}/${bitrate}/${avgDuration}/${maxFetchDuration}/${fetchDuration}`
`[abr] level:${i} adjustedbw-bitrate:${Math.round(
adjustedbw - bitrate
)} avgDuration:${avgDuration.toFixed(
1
)} maxFetchDuration:${maxFetchDuration.toFixed(
1
)} fetchDuration:${fetchDuration.toFixed(1)}`
);
// if adjusted bw is greater than level bitrate AND
if (

node_modules/hls.js/src/controller/audio-stream-controller.ts (generated, vendored)

@ -3,14 +3,14 @@ import { Events } from '../events';
import { Bufferable, BufferHelper } from '../utils/buffer-helper';
import { FragmentState } from './fragment-tracker';
import { Level } from '../types/level';
import { PlaylistLevelType } from '../types/loader';
import { PlaylistContextType, PlaylistLevelType } from '../types/loader';
import { Fragment, ElementaryStreamTypes, Part } from '../loader/fragment';
import ChunkCache from '../demux/chunk-cache';
import TransmuxerInterface from '../demux/transmuxer-interface';
import { ChunkMetadata } from '../types/transmuxer';
import { fragmentWithinToleranceTest } from './fragment-finders';
import { alignMediaPlaylistByPDT } from '../utils/discontinuities';
import { ErrorDetails, ErrorTypes } from '../errors';
import { ErrorDetails } from '../errors';
import type { NetworkComponentAPI } from '../types/component-api';
import type Hls from '../hls';
import type { FragmentTracker } from './fragment-tracker';
@ -33,6 +33,7 @@ import type {
FragBufferedData,
ErrorData,
} from '../types/events';
import type { MediaPlaylist } from '../types/media-playlist';
const TICK_INTERVAL = 100; // how often to tick in ms
@ -50,7 +51,8 @@ class AudioStreamController
private videoBuffer: Bufferable | null = null;
private videoTrackCC: number = -1;
private waitingVideoCC: number = -1;
private audioSwitch: boolean = false;
private bufferedTrack: MediaPlaylist | null = null;
private switchingTrack: MediaPlaylist | null = null;
private trackId: number = -1;
private waitingData: WaitingForPTSData | null = null;
private mainDetails: LevelDetails | null = null;
@ -62,13 +64,21 @@ class AudioStreamController
fragmentTracker: FragmentTracker,
keyLoader: KeyLoader
) {
super(hls, fragmentTracker, keyLoader, '[audio-stream-controller]');
super(
hls,
fragmentTracker,
keyLoader,
'[audio-stream-controller]',
PlaylistLevelType.AUDIO
);
this._registerListeners();
}
protected onHandlerDestroying() {
this._unregisterListeners();
this.mainDetails = null;
this.bufferedTrack = null;
this.switchingTrack = null;
}
private _registerListeners() {
@ -108,13 +118,13 @@ class AudioStreamController
// INIT_PTS_FOUND is triggered when the video track parsed in the stream-controller has a new PTS value
onInitPtsFound(
event: Events.INIT_PTS_FOUND,
{ frag, id, initPTS }: InitPTSFoundData
{ frag, id, initPTS, timescale }: InitPTSFoundData
) {
// Always update the new INIT PTS
// Can change due to level switch
if (id === 'main') {
const cc = frag.cc;
this.initPTS[frag.cc] = initPTS;
this.initPTS[frag.cc] = { baseTime: initPTS, timescale };
this.log(`InitPTS for cc: ${cc} found from main: ${initPTS}`);
this.videoTrackCC = cc;
// If we are waiting, tick immediately to unblock audio fragment transmuxing
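The hunk above changes `initPTS` from a plain number of seconds to a `{ baseTime, timescale }` pair, so the original media timescale is preserved until a conversion is actually needed. A small sketch of that shape, assumed from the assignment shown rather than copied from the library:

```ts
// Assumed shape of the rational timestamp used for initPTS above.
interface RationalTimestamp {
  baseTime: number; // value expressed in `timescale` units
  timescale: number; // units per second (e.g. 90000 for MPEG-TS)
}

function toSeconds({ baseTime, timescale }: RationalTimestamp): number {
  return baseTime / timescale;
}

// An MPEG-TS PTS of 900000 at a 90 kHz timescale is 10 seconds.
const exampleInitPTS: RationalTimestamp = { baseTime: 900000, timescale: 90000 };
console.log(toSeconds(exampleInitPTS)); // 10
```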
@ -133,7 +143,6 @@ class AudioStreamController
const lastCurrentTime = this.lastCurrentTime;
this.stopLoad();
this.setInterval(TICK_INTERVAL);
this.fragLoadError = 0;
if (lastCurrentTime > 0 && startPosition === -1) {
this.log(
`Override startPosition with lastCurrentTime @${lastCurrentTime.toFixed(
@ -253,7 +262,7 @@ class AudioStreamController
protected onTickEnd() {
const { media } = this;
if (!media || !media.readyState) {
if (!media?.readyState) {
// Exit early if we don't have media or if the media hasn't buffered anything yet (readyState 0)
return;
}
@ -265,7 +274,7 @@ class AudioStreamController
const { hls, levels, media, trackId } = this;
const config = hls.config;
if (!levels || !levels[trackId]) {
if (!levels?.[trackId]) {
return;
}
@ -306,9 +315,9 @@ class AudioStreamController
if (bufferInfo === null) {
return;
}
const audioSwitch = this.audioSwitch;
const { bufferedTrack, switchingTrack } = this;
if (!audioSwitch && this._streamEnded(bufferInfo, trackDetails)) {
if (!switchingTrack && this._streamEnded(bufferInfo, trackDetails)) {
hls.trigger(Events.BUFFER_EOS, { type: 'audio' });
this.state = State.ENDED;
return;
@ -322,16 +331,18 @@ class AudioStreamController
const maxBufLen = this.getMaxBufferLength(mainBufferInfo?.len);
// if buffer length is less than maxBufLen try to load a new fragment
if (bufferLen >= maxBufLen && !audioSwitch) {
if (bufferLen >= maxBufLen && !switchingTrack) {
return;
}
const fragments = trackDetails.fragments;
const start = fragments[0].start;
let targetBufferTime = bufferInfo.end;
if (audioSwitch && media) {
if (switchingTrack && media) {
const pos = this.getLoadPosition();
targetBufferTime = pos;
if (bufferedTrack && switchingTrack.attrs !== bufferedTrack.attrs) {
targetBufferTime = pos;
}
// if currentTime (pos) is less than alt audio playlist start time, it means that alt audio is ahead of currentTime
if (trackDetails.PTSKnown && pos < start) {
// if everything is buffered from pos to start or if audio buffer upfront, let's seek to start
@ -344,25 +355,50 @@ class AudioStreamController
}
}
// buffer audio up to one target duration ahead of main buffer
if (
mainBufferInfo &&
targetBufferTime > mainBufferInfo.end + trackDetails.targetduration
) {
return;
let frag = this.getNextFragment(targetBufferTime, trackDetails);
let atGap = false;
// Avoid loop loading by using nextLoadPosition set for backtracking and skipping consecutive GAP tags
if (frag && this.isLoopLoading(frag, targetBufferTime)) {
atGap = !!frag.gap;
frag = this.getNextFragmentLoopLoading(
frag,
trackDetails,
bufferInfo,
PlaylistLevelType.MAIN,
maxBufLen
);
}
// wait for main buffer after buffering some audio
if ((!mainBufferInfo || !mainBufferInfo.len) && bufferInfo.len) {
return;
}
const frag = this.getNextFragment(targetBufferTime, trackDetails);
if (!frag) {
this.bufferFlushed = true;
return;
}
this.loadFragment(frag, trackDetails, targetBufferTime);
// Buffer audio up to one target duration ahead of main buffer
const atBufferSyncLimit =
mainBufferInfo &&
frag.start > mainBufferInfo.end + trackDetails.targetduration;
if (
atBufferSyncLimit ||
// Or wait for main buffer after buffering some audio
(!mainBufferInfo?.len && bufferInfo.len)
) {
// Check fragment-tracker for main fragments since GAP segments do not show up in bufferInfo
const mainFrag = this.getAppendedFrag(frag.start, PlaylistLevelType.MAIN);
if (mainFrag === null) {
return;
}
// Bridge gaps in main buffer
atGap ||=
!!mainFrag.gap || (!!atBufferSyncLimit && mainBufferInfo.len === 0);
if (
(atBufferSyncLimit && !atGap) ||
(atGap && bufferInfo.nextStart && bufferInfo.nextStart < mainFrag.end)
) {
return;
}
}
this.loadFragment(frag, levelInfo, targetBufferTime);
}
protected getMaxBufferLength(mainBufferLength?: number): number {
@ -370,7 +406,10 @@ class AudioStreamController
if (!mainBufferLength) {
return maxConfigBuffer;
}
return Math.max(maxConfigBuffer, mainBufferLength);
return Math.min(
Math.max(maxConfigBuffer, mainBufferLength),
this.config.maxMaxBufferLength
);
}
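`getMaxBufferLength` now lets the audio target follow the main buffer length but caps it at `maxMaxBufferLength`, so the audio buffer can no longer grow without bound. A worked sketch of the clamp (values are examples only):

```ts
// Sketch of the clamp applied above.
function audioMaxBufferLength(
  maxBufferLength: number, // configured target, e.g. 30 s
  mainBufferLength: number, // current main (video) forward buffer in seconds
  maxMaxBufferLength: number // hard cap, e.g. 600 s
): number {
  return Math.min(Math.max(maxBufferLength, mainBufferLength), maxMaxBufferLength);
}

audioMaxBufferLength(30, 45, 600); // 45
audioMaxBufferLength(30, 900, 600); // 600
```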
onMediaDetaching() {
@ -397,9 +436,9 @@ class AudioStreamController
if (fragCurrent) {
fragCurrent.abortRequests();
this.removeUnbufferedFrags(fragCurrent.start);
}
this.fragCurrent = null;
this.clearWaitingFragment();
this.resetLoadingState();
// destroy useless transmuxer when switching audio to main
if (!altAudio) {
this.resetTransmuxer();
@ -410,20 +449,30 @@ class AudioStreamController
// should we switch tracks ?
if (altAudio) {
this.audioSwitch = true;
this.switchingTrack = data;
// main audio tracks are handled by the stream-controller, just do something if switching to an alt audio track
this.state = State.IDLE;
} else {
this.switchingTrack = null;
this.bufferedTrack = data;
this.state = State.STOPPED;
}
this.tick();
}
onManifestLoading() {
this.mainDetails = null;
this.fragmentTracker.removeAllFragments();
this.startPosition = this.lastCurrentTime = 0;
this.bufferFlushed = false;
this.levels =
this.mainDetails =
this.waitingData =
this.bufferedTrack =
this.cachedTrackLoadedData =
this.switchingTrack =
null;
this.startFragRequested = false;
this.trackId = this.videoTrackCC = this.waitingVideoCC = -1;
}
onLevelLoaded(event: Events.LEVEL_LOADED, data: LevelLoadedData) {
@ -446,7 +495,11 @@ class AudioStreamController
return;
}
this.log(
`Track ${trackId} loaded [${newDetails.startSN},${newDetails.endSN}],duration:${newDetails.totalduration}`
`Track ${trackId} loaded [${newDetails.startSN},${newDetails.endSN}]${
newDetails.lastPartSn
? `[part-${newDetails.lastPartSn}-${newDetails.lastPartIndex}]`
: ''
},duration:${newDetails.totalduration}`
);
const track = levels[trackId];
@ -502,12 +555,16 @@ class AudioStreamController
}
const track = levels[trackId] as Level;
console.assert(track, 'Audio track is defined on fragment load progress');
if (!track) {
this.warn('Audio track is undefined on fragment load progress');
return;
}
const details = track.details as LevelDetails;
console.assert(
details,
'Audio track details are defined on fragment load progress'
);
if (!details) {
this.warn('Audio track details undefined on fragment load progress');
this.removeUnbufferedFrags(frag.start);
return;
}
const audioCodec =
config.defaultAudioCodec || track.audioCodec || 'mp4a.40.2';
@ -595,8 +652,12 @@ class AudioStreamController
const { frag, part } = data;
if (frag.type !== PlaylistLevelType.AUDIO) {
if (!this.loadedmetadata && frag.type === PlaylistLevelType.MAIN) {
if ((this.videoBuffer || this.media)?.buffered.length) {
this.loadedmetadata = true;
const bufferable = this.videoBuffer || this.media;
if (bufferable) {
const bufferedTimeRanges = BufferHelper.getBuffered(bufferable);
if (bufferedTimeRanges.length) {
this.loadedmetadata = true;
}
}
}
return;
@ -609,73 +670,62 @@ class AudioStreamController
frag.level
} finished buffering, but was aborted. state: ${
this.state
}, audioSwitch: ${this.audioSwitch}`
}, audioSwitch: ${
this.switchingTrack ? this.switchingTrack.name : 'false'
}`
);
return;
}
if (frag.sn !== 'initSegment') {
this.fragPrevious = frag;
if (this.audioSwitch) {
this.audioSwitch = false;
this.hls.trigger(Events.AUDIO_TRACK_SWITCHED, { id: this.trackId });
const track = this.switchingTrack;
if (track) {
this.bufferedTrack = track;
this.switchingTrack = null;
this.hls.trigger(Events.AUDIO_TRACK_SWITCHED, { ...track });
}
}
this.fragBufferedComplete(frag, part);
}
private onError(event: Events.ERROR, data: ErrorData) {
if (data.type === ErrorTypes.KEY_SYSTEM_ERROR) {
this.onFragmentOrKeyLoadError(PlaylistLevelType.AUDIO, data);
if (data.fatal) {
this.state = State.ERROR;
return;
}
switch (data.details) {
case ErrorDetails.FRAG_GAP:
case ErrorDetails.FRAG_PARSING_ERROR:
case ErrorDetails.FRAG_DECRYPT_ERROR:
case ErrorDetails.FRAG_LOAD_ERROR:
case ErrorDetails.FRAG_LOAD_TIMEOUT:
case ErrorDetails.FRAG_PARSING_ERROR:
case ErrorDetails.KEY_LOAD_ERROR:
case ErrorDetails.KEY_LOAD_TIMEOUT:
// TODO: Skip fragments that do not belong to this.fragCurrent audio-group id
this.onFragmentOrKeyLoadError(PlaylistLevelType.AUDIO, data);
break;
case ErrorDetails.AUDIO_TRACK_LOAD_ERROR:
case ErrorDetails.AUDIO_TRACK_LOAD_TIMEOUT:
// when in ERROR state, don't switch back to IDLE state in case a non-fatal error is received
if (this.state !== State.ERROR && this.state !== State.STOPPED) {
// if fatal error, stop processing, otherwise move to IDLE to retry loading
this.state = data.fatal ? State.ERROR : State.IDLE;
this.warn(
`${data.details} while loading frag, switching to ${this.state} state`
);
case ErrorDetails.LEVEL_PARSING_ERROR:
// in case of non fatal error while loading track, if not retrying to load track, switch back to IDLE
if (
!data.levelRetry &&
this.state === State.WAITING_TRACK &&
data.context?.type === PlaylistContextType.AUDIO_TRACK
) {
this.state = State.IDLE;
}
break;
case ErrorDetails.BUFFER_FULL_ERROR:
// if in appending state
if (
data.parent === 'audio' &&
(this.state === State.PARSING || this.state === State.PARSED)
) {
let flushBuffer = true;
const bufferedInfo = this.getFwdBufferInfo(
this.mediaBuffer,
PlaylistLevelType.AUDIO
);
// 0.5 : tolerance needed as some browsers stall playback before reaching buffered end
// reduce max buf len if current position is buffered
if (bufferedInfo && bufferedInfo.len > 0.5) {
flushBuffer = !this.reduceMaxBufferLength(bufferedInfo.len);
}
if (flushBuffer) {
// current position is not buffered, but browser is still complaining about buffer full error
// this happens on IE/Edge, refer to https://github.com/video-dev/hls.js/pull/708
// in that case flush the whole audio buffer to recover
this.warn(
'Buffer full error also media.currentTime is not buffered, flush audio buffer'
);
this.fragCurrent = null;
super.flushMainBuffer(0, Number.POSITIVE_INFINITY, 'audio');
}
this.resetLoadingState();
if (!data.parent || data.parent !== 'audio') {
return;
}
if (this.reduceLengthAndFlushBuffer(data)) {
this.bufferedTrack = null;
super.flushMainBuffer(0, Number.POSITIVE_INFINITY, 'audio');
}
break;
case ErrorDetails.INTERNAL_EXCEPTION:
this.recoverWorkerError(data);
break;
default:
break;
@ -701,34 +751,30 @@ class AudioStreamController
const context = this.getCurrentContext(chunkMeta);
if (!context) {
this.warn(
`The loading context changed while buffering fragment ${chunkMeta.sn} of level ${chunkMeta.level}. This chunk will not be buffered.`
);
this.resetStartWhenNotLoaded(chunkMeta.level);
this.resetWhenMissingContext(chunkMeta);
return;
}
const {
frag,
part,
level: { details },
} = context;
const { frag, part, level } = context;
const { details } = level;
const { audio, text, id3, initSegment } = remuxResult;
// Check if the current fragment has been aborted. We check this by first seeing if we're still playing the current level.
// If we are, subsequently check if the currently loading fragment (fragCurrent) has changed.
if (this.fragContextChanged(frag) || !details) {
this.fragmentTracker.removeFragment(frag);
return;
}
this.state = State.PARSING;
if (this.audioSwitch && audio) {
this.completeAudioSwitch();
if (this.switchingTrack && audio) {
this.completeAudioSwitch(this.switchingTrack);
}
if (initSegment?.tracks) {
this._bufferInitSegment(initSegment.tracks, frag, chunkMeta);
const mapFragment = frag.initSegment || frag;
this._bufferInitSegment(initSegment.tracks, mapFragment, chunkMeta);
hls.trigger(Events.FRAG_PARSING_INIT_SEGMENT, {
frag,
frag: mapFragment,
id,
tracks: initSegment.tracks,
});
@ -821,7 +867,7 @@ class AudioStreamController
protected loadFragment(
frag: Fragment,
trackDetails: LevelDetails,
track: Level,
targetBufferTime: number
) {
// only load if fragment is not loaded or if in audio switch
@ -830,32 +876,43 @@ class AudioStreamController
// we force a frag loading in audio switch as fragment tracker might not have evicted previous frags in case of quick audio switch
if (
this.audioSwitch ||
this.switchingTrack ||
fragState === FragmentState.NOT_LOADED ||
fragState === FragmentState.PARTIAL
) {
if (frag.sn === 'initSegment') {
this._loadInitSegment(frag, trackDetails);
} else if (trackDetails.live && !Number.isFinite(this.initPTS[frag.cc])) {
this._loadInitSegment(frag, track);
} else if (track.details?.live && !this.initPTS[frag.cc]) {
this.log(
`Waiting for video PTS in continuity counter ${frag.cc} of live stream before loading audio fragment ${frag.sn} of level ${this.trackId}`
);
this.state = State.WAITING_INIT_PTS;
} else {
this.startFragRequested = true;
super.loadFragment(frag, trackDetails, targetBufferTime);
super.loadFragment(frag, track, targetBufferTime);
}
} else {
this.clearTrackerIfNeeded(frag);
}
}
private completeAudioSwitch() {
const { hls, media, trackId } = this;
if (media) {
private completeAudioSwitch(switchingTrack: MediaPlaylist) {
const { hls, media, bufferedTrack } = this;
const bufferedAttributes = bufferedTrack?.attrs;
const switchAttributes = switchingTrack.attrs;
if (
media &&
bufferedAttributes &&
(bufferedAttributes.CHANNELS !== switchAttributes.CHANNELS ||
bufferedAttributes.NAME !== switchAttributes.NAME ||
bufferedAttributes.LANGUAGE !== switchAttributes.LANGUAGE)
) {
this.log('Switching audio track : flushing all audio');
super.flushMainBuffer(0, Number.POSITIVE_INFINITY, 'audio');
}
this.audioSwitch = false;
hls.trigger(Events.AUDIO_TRACK_SWITCHED, { id: trackId });
this.bufferedTrack = switchingTrack;
this.switchingTrack = null;
hls.trigger(Events.AUDIO_TRACK_SWITCHED, { ...switchingTrack });
}
}
export default AudioStreamController;
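`completeAudioSwitch` now compares the buffered and switching tracks' `CHANNELS`, `NAME` and `LANGUAGE` attributes and only flushes the audio buffer when they differ, so switching between identical renditions (for example redundant streams) keeps the existing buffer. A simplified sketch of that decision; the track shape below is an assumption, not the library's `MediaPlaylist` type:

```ts
// Simplified EXT-X-MEDIA attribute subset used for the flush decision.
interface AudioTrackAttrs {
  CHANNELS?: string;
  NAME?: string;
  LANGUAGE?: string;
}

function needsAudioFlush(
  buffered: AudioTrackAttrs | null,
  switching: AudioTrackAttrs
): boolean {
  if (!buffered) {
    return false; // nothing from another rendition is buffered yet
  }
  return (
    buffered.CHANNELS !== switching.CHANNELS ||
    buffered.NAME !== switching.NAME ||
    buffered.LANGUAGE !== switching.LANGUAGE
  );
}
```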


@ -19,7 +19,7 @@ class AudioTrackController extends BasePlaylistController {
private groupId: string | null = null;
private tracksInGroup: MediaPlaylist[] = [];
private trackId: number = -1;
private trackName: string = '';
private currentTrack: MediaPlaylist | null = null;
private selectDefaultTrack: boolean = true;
constructor(hls: Hls) {
@ -51,6 +51,7 @@ class AudioTrackController extends BasePlaylistController {
this.unregisterListeners();
this.tracks.length = 0;
this.tracksInGroup.length = 0;
this.currentTrack = null;
super.destroy();
}
@ -59,7 +60,7 @@ class AudioTrackController extends BasePlaylistController {
this.groupId = null;
this.tracksInGroup = [];
this.trackId = -1;
this.trackName = '';
this.currentTrack = null;
this.selectDefaultTrack = true;
}
@ -74,20 +75,23 @@ class AudioTrackController extends BasePlaylistController {
event: Events.AUDIO_TRACK_LOADED,
data: AudioTrackLoadedData
): void {
const { id, details } = data;
const currentTrack = this.tracksInGroup[id];
const { id, groupId, details } = data;
const trackInActiveGroup = this.tracksInGroup[id];
if (!currentTrack) {
this.warn(`Invalid audio track id ${id}`);
if (!trackInActiveGroup || trackInActiveGroup.groupId !== groupId) {
this.warn(
`Track with id:${id} and group:${groupId} not found in active group ${trackInActiveGroup.groupId}`
);
return;
}
const curDetails = currentTrack.details;
currentTrack.details = data.details;
this.log(`audioTrack ${id} loaded [${details.startSN}-${details.endSN}]`);
const curDetails = trackInActiveGroup.details;
trackInActiveGroup.details = data.details;
this.log(
`audio-track ${id} "${trackInActiveGroup.name}" lang:${trackInActiveGroup.lang} group:${groupId} loaded [${details.startSN}-${details.endSN}]`
);
if (id === this.trackId) {
this.retryCount = 0;
this.playlistLoaded(id, data, curDetails);
}
}
@ -115,7 +119,7 @@ class AudioTrackController extends BasePlaylistController {
const audioGroupId = levelInfo.audioGroupIds[levelInfo.urlId];
if (this.groupId !== audioGroupId) {
this.groupId = audioGroupId;
this.groupId = audioGroupId || null;
const audioTracks = this.tracks.filter(
(track): boolean => !audioGroupId || track.groupId === audioGroupId
@ -132,16 +136,18 @@ class AudioTrackController extends BasePlaylistController {
this.tracksInGroup = audioTracks;
const audioTracksUpdated: AudioTracksUpdatedData = { audioTracks };
this.log(
`Updating audio tracks, ${audioTracks.length} track(s) found in "${audioGroupId}" group-id`
`Updating audio tracks, ${audioTracks.length} track(s) found in group:${audioGroupId}`
);
this.hls.trigger(Events.AUDIO_TRACKS_UPDATED, audioTracksUpdated);
this.selectInitialTrack();
} else if (this.shouldReloadPlaylist(this.currentTrack)) {
// Retry playlist loading if no playlist is or has been loaded yet
this.setAudioTrack(this.trackId);
}
}
protected onError(event: Events.ERROR, data: ErrorData): void {
super.onError(event, data);
if (data.fatal || !data.context) {
return;
}
@ -151,7 +157,8 @@ class AudioTrackController extends BasePlaylistController {
data.context.id === this.trackId &&
data.context.groupId === this.groupId
) {
this.retryLoadingOrFail(data);
this.requestScheduled = -1;
this.checkRetry(data);
}
}
@ -181,20 +188,17 @@ class AudioTrackController extends BasePlaylistController {
// stopping live reloading timer if any
this.clearTimer();
const lastTrack = tracks[this.trackId];
this.log(`Now switching to audio-track index ${newId}`);
const lastTrack = this.currentTrack;
tracks[this.trackId];
const track = tracks[newId];
const { id, groupId = '', name, type, url } = track;
const { groupId, name } = track;
this.log(
`Switching to audio-track ${newId} "${name}" lang:${track.lang} group:${groupId}`
);
this.trackId = newId;
this.trackName = name;
this.currentTrack = track;
this.selectDefaultTrack = false;
this.hls.trigger(Events.AUDIO_TRACK_SWITCHING, {
id,
groupId,
name,
type,
url,
});
this.hls.trigger(Events.AUDIO_TRACK_SWITCHING, { ...track });
// Do not reload track unless live
if (track.details && !track.details.live) {
return;
@ -205,33 +209,43 @@ class AudioTrackController extends BasePlaylistController {
private selectInitialTrack(): void {
const audioTracks = this.tracksInGroup;
console.assert(
audioTracks.length,
'Initial audio track should be selected when tracks are known'
);
const currentAudioTrackName = this.trackName;
const trackId =
this.findTrackId(currentAudioTrackName) || this.findTrackId();
this.findTrackId(this.currentTrack) | this.findTrackId(null);
if (trackId !== -1) {
this.setAudioTrack(trackId);
} else {
this.warn(`No track found for running audio group-ID: ${this.groupId}`);
const error = new Error(
`No track found for running audio group-ID: ${this.groupId} track count: ${audioTracks.length}`
);
this.warn(error.message);
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.AUDIO_TRACK_LOAD_ERROR,
fatal: true,
error,
});
}
}
private findTrackId(name?: string): number {
private findTrackId(currentTrack: MediaPlaylist | null): number {
const audioTracks = this.tracksInGroup;
for (let i = 0; i < audioTracks.length; i++) {
const track = audioTracks[i];
if (!this.selectDefaultTrack || track.default) {
if (!name || name === track.name) {
if (
!currentTrack ||
(currentTrack.attrs['STABLE-RENDITION-ID'] !== undefined &&
currentTrack.attrs['STABLE-RENDITION-ID'] ===
track.attrs['STABLE-RENDITION-ID'])
) {
return track.id;
}
if (
currentTrack.name === track.name &&
currentTrack.lang === track.lang
) {
return track.id;
}
}
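`findTrackId` now prefers a `STABLE-RENDITION-ID` match and only then falls back to matching name plus language, instead of matching by name alone. A simplified sketch of that ordering with an assumed track shape (the `default`/`selectDefaultTrack` filtering shown above is omitted here):

```ts
// Minimal track shape for illustrating the selection order.
interface TrackLike {
  id: number;
  name: string;
  lang?: string;
  attrs: { 'STABLE-RENDITION-ID'?: string };
}

function matchTrack(current: TrackLike | null, candidates: TrackLike[]): number {
  for (const track of candidates) {
    if (!current) {
      return track.id; // no previous selection: take the first eligible track
    }
    const stableId = current.attrs['STABLE-RENDITION-ID'];
    if (stableId !== undefined && stableId === track.attrs['STABLE-RENDITION-ID']) {
      return track.id; // same rendition across group or variant changes
    }
    if (current.name === track.name && current.lang === track.lang) {
      return track.id; // fallback: same name and language
    }
  }
  return -1;
}
```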
@ -242,7 +256,7 @@ class AudioTrackController extends BasePlaylistController {
protected loadPlaylist(hlsUrlParameters?: HlsUrlParameters): void {
super.loadPlaylist();
const audioTrack = this.tracksInGroup[this.trackId];
if (this.shouldLoadTrack(audioTrack)) {
if (this.shouldLoadPlaylist(audioTrack)) {
const id = audioTrack.id;
const groupId = audioTrack.groupId as string;
let url = audioTrack.url;
@ -256,7 +270,9 @@ class AudioTrackController extends BasePlaylistController {
}
}
// track not retrieved yet, or live playlist we need to (re)load it
this.log(`loading audio-track playlist for id: ${id}`);
this.log(
`loading audio-track playlist ${id} "${audioTrack.name}" lang:${audioTrack.lang} group:${groupId}`
);
this.clearTimer();
this.hls.trigger(Events.AUDIO_TRACK_LOADING, {
url,


@ -1,7 +1,10 @@
import type Hls from '../hls';
import type { NetworkComponentAPI } from '../types/component-api';
import { getSkipValue, HlsSkip, HlsUrlParameters } from '../types/level';
import { getSkipValue, HlsSkip, HlsUrlParameters, Level } from '../types/level';
import { computeReloadInterval, mergeDetails } from './level-helper';
import { ErrorData } from '../types/events';
import { getRetryDelay, isTimeoutError } from '../utils/error-helper';
import { NetworkErrorAction } from './error-controller';
import { logger } from '../utils/logger';
import type { LevelDetails } from '../loader/level-details';
import type { MediaPlaylist } from '../types/media-playlist';
@ -10,16 +13,12 @@ import type {
LevelLoadedData,
TrackLoadedData,
} from '../types/events';
import { ErrorData } from '../types/events';
import { Events } from '../events';
import { ErrorTypes } from '../errors';
export default class BasePlaylistController implements NetworkComponentAPI {
protected hls: Hls;
protected timer: number = -1;
protected requestScheduled: number = -1;
protected canLoad: boolean = false;
protected retryCount: number = 0;
protected log: (msg: any) => void;
protected warn: (msg: any) => void;
@ -35,16 +34,6 @@ export default class BasePlaylistController implements NetworkComponentAPI {
this.hls = this.log = this.warn = null;
}
protected onError(event: Events.ERROR, data: ErrorData): void {
if (
data.fatal &&
(data.type === ErrorTypes.NETWORK_ERROR ||
data.type === ErrorTypes.KEY_SYSTEM_ERROR)
) {
this.stopLoad();
}
}
protected clearTimer(): void {
clearTimeout(this.timer);
this.timer = -1;
@ -52,7 +41,6 @@ export default class BasePlaylistController implements NetworkComponentAPI {
public startLoad(): void {
this.canLoad = true;
this.retryCount = 0;
this.requestScheduled = -1;
this.loadPlaylist();
}
@ -68,6 +56,7 @@ export default class BasePlaylistController implements NetworkComponentAPI {
): HlsUrlParameters | undefined {
const renditionReports = previous?.renditionReports;
if (renditionReports) {
let foundIndex = -1;
for (let i = 0; i < renditionReports.length; i++) {
const attr = renditionReports[i];
let uri: string;
@ -79,25 +68,34 @@ export default class BasePlaylistController implements NetworkComponentAPI {
);
uri = attr.URI || '';
}
if (uri === playlistUri.slice(-uri.length)) {
const msn = parseInt(attr['LAST-MSN']) || previous?.lastPartSn;
let part = parseInt(attr['LAST-PART']) || previous?.lastPartIndex;
if (this.hls.config.lowLatencyMode) {
const currentGoal = Math.min(
previous.age - previous.partTarget,
previous.targetduration
);
if (part >= 0 && currentGoal > previous.partTarget) {
part += 1;
}
}
return new HlsUrlParameters(
msn,
part >= 0 ? part : undefined,
HlsSkip.No
);
// Use exact match. Otherwise, the last partial match, if any, will be used
// (Playlist URI includes a query string that the Rendition Report does not)
if (uri === playlistUri) {
foundIndex = i;
break;
} else if (uri === playlistUri.substring(0, uri.length)) {
foundIndex = i;
}
}
if (foundIndex !== -1) {
const attr = renditionReports[foundIndex];
const msn = parseInt(attr['LAST-MSN']) || previous?.lastPartSn;
let part = parseInt(attr['LAST-PART']) || previous?.lastPartIndex;
if (this.hls.config.lowLatencyMode) {
const currentGoal = Math.min(
previous.age - previous.partTarget,
previous.targetduration
);
if (part >= 0 && currentGoal > previous.partTarget) {
part += 1;
}
}
return new HlsUrlParameters(
msn,
part >= 0 ? part : undefined,
HlsSkip.No
);
}
}
}
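The rendition-report handling above now requires an exact URI match before falling back to the last partial match, and still turns `LAST-MSN`/`LAST-PART` into `HlsUrlParameters` for a blocking reload. For context, a hedged sketch of how such directives are typically appended to the playlist request as the LL-HLS `_HLS_msn`/`_HLS_part`/`_HLS_skip` query parameters; the helper below is illustrative, not the library's own implementation:

```ts
// Sketch: attach LL-HLS delivery directives to a playlist URI as query parameters.
function appendDirectives(
  uri: string,
  msn?: number,
  part?: number,
  skip?: string
): string {
  // the base URL is only needed to resolve relative playlist URIs in this example
  const url = new URL(uri, 'https://example.com/');
  if (msn !== undefined) url.searchParams.set('_HLS_msn', String(msn));
  if (part !== undefined) url.searchParams.set('_HLS_part', String(part));
  if (skip) url.searchParams.set('_HLS_skip', skip);
  return url.toString();
}

appendDirectives('audio/playlist.m3u8', 1234, 2);
// 'https://example.com/audio/playlist.m3u8?_HLS_msn=1234&_HLS_part=2'
```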
@ -105,14 +103,27 @@ export default class BasePlaylistController implements NetworkComponentAPI {
if (this.requestScheduled === -1) {
this.requestScheduled = self.performance.now();
}
// Loading is handled by the subclasses
}
protected shouldLoadTrack(track: MediaPlaylist): boolean {
protected shouldLoadPlaylist(
playlist: Level | MediaPlaylist | null | undefined
): boolean {
return (
this.canLoad &&
track &&
!!track.url &&
(!track.details || track.details.live)
!!playlist &&
!!playlist.url &&
(!playlist.details || playlist.details.live)
);
}
protected shouldReloadPlaylist(
playlist: Level | MediaPlaylist | null | undefined
): boolean {
return (
this.timer === -1 &&
this.requestScheduled === -1 &&
this.shouldLoadPlaylist(playlist)
);
}
@ -149,7 +160,7 @@ export default class BasePlaylistController implements NetworkComponentAPI {
if (!this.canLoad || !details.live) {
return;
}
let deliveryDirectives: HlsUrlParameters;
let deliveryDirectives: HlsUrlParameters | undefined;
let msn: number | undefined = undefined;
let part: number | undefined = undefined;
if (details.canBlockReload && details.endSN && details.advanced) {
@ -213,7 +224,7 @@ export default class BasePlaylistController implements NetworkComponentAPI {
this.loadPlaylist(deliveryDirectives);
return;
}
} else {
} else if (details.canBlockReload) {
deliveryDirectives = this.getDeliveryDirectives(
details,
data.deliveryDirectives,
@ -228,9 +239,7 @@ export default class BasePlaylistController implements NetworkComponentAPI {
details,
distanceToLiveEdgeMs
);
if (!details.updated) {
this.requestScheduled = -1;
} else if (now > this.requestScheduled + reloadInterval) {
if (details.updated && now > this.requestScheduled + reloadInterval) {
this.requestScheduled = stats.loading.start;
}
@ -239,10 +248,13 @@ export default class BasePlaylistController implements NetworkComponentAPI {
stats.loading.first +
reloadInterval -
(details.partTarget * 1000 || 1000);
} else {
this.requestScheduled =
(this.requestScheduled === -1 ? now : this.requestScheduled) +
reloadInterval;
} else if (
this.requestScheduled === -1 ||
this.requestScheduled + reloadInterval < now
) {
this.requestScheduled = now;
} else if (this.requestScheduled - now <= 0) {
this.requestScheduled += reloadInterval;
}
let estimatedTimeUntilUpdate = this.requestScheduled - now;
estimatedTimeUntilUpdate = Math.max(0, estimatedTimeUntilUpdate);
@ -251,19 +263,21 @@ export default class BasePlaylistController implements NetworkComponentAPI {
estimatedTimeUntilUpdate
)} ms`
);
// this.log(
// `live reload ${details.updated ? 'REFRESHED' : 'MISSED'}
// this.log(
// `live reload ${details.updated ? 'REFRESHED' : 'MISSED'}
// reload in ${estimatedTimeUntilUpdate / 1000}
// round trip ${(stats.loading.end - stats.loading.start) / 1000}
// diff ${
// (reloadInterval -
// (estimatedTimeUntilUpdate + stats.loading.end - stats.loading.start)) /
// (estimatedTimeUntilUpdate +
// stats.loading.end -
// stats.loading.start)) /
// 1000
// }
// reload interval ${reloadInterval / 1000}
// target duration ${details.targetduration}
// distance to edge ${distanceToLiveEdgeMs / 1000}`
// );
// );
this.timer = self.setTimeout(
() => this.loadPlaylist(deliveryDirectives),
@ -289,39 +303,43 @@ export default class BasePlaylistController implements NetworkComponentAPI {
return new HlsUrlParameters(msn, part, skip);
}
protected retryLoadingOrFail(errorEvent: ErrorData): boolean {
const { config } = this.hls;
const retry = this.retryCount < config.levelLoadingMaxRetry;
protected checkRetry(errorEvent: ErrorData): boolean {
const errorDetails = errorEvent.details;
const isTimeout = isTimeoutError(errorEvent);
const errorAction = errorEvent.errorAction;
const { action, retryCount = 0, retryConfig } = errorAction || {};
const retry =
!!errorAction &&
!!retryConfig &&
(action === NetworkErrorAction.RetryRequest ||
(!errorAction.resolved &&
action === NetworkErrorAction.SendAlternateToPenaltyBox));
if (retry) {
this.requestScheduled = -1;
this.retryCount++;
if (
errorEvent.details.indexOf('LoadTimeOut') > -1 &&
errorEvent.context?.deliveryDirectives
) {
if (retryCount >= retryConfig.maxNumRetry) {
return false;
}
if (isTimeout && errorEvent.context?.deliveryDirectives) {
// The LL-HLS request already timed out so retry immediately
this.warn(
`retry playlist loading #${this.retryCount} after "${errorEvent.details}"`
`Retrying playlist loading ${retryCount + 1}/${
retryConfig.maxNumRetry
} after "${errorDetails}" without delivery-directives`
);
this.loadPlaylist();
} else {
// exponential backoff capped to max retry timeout
const delay = Math.min(
Math.pow(2, this.retryCount) * config.levelLoadingRetryDelay,
config.levelLoadingMaxRetryTimeout
);
const delay = getRetryDelay(retryConfig, retryCount);
// Schedule level/track reload
this.timer = self.setTimeout(() => this.loadPlaylist(), delay);
this.warn(
`retry playlist loading #${this.retryCount} in ${delay} ms after "${errorEvent.details}"`
`Retrying playlist loading ${retryCount + 1}/${
retryConfig.maxNumRetry
} after "${errorDetails}" in ${delay}ms`
);
}
} else {
this.warn(`cannot recover from error "${errorEvent.details}"`);
// stopping live reloading timer if any
this.clearTimer();
// switch error to fatal
errorEvent.fatal = true;
// `levelRetry = true` used to inform other controllers that a retry is happening
errorEvent.levelRetry = true;
errorAction.resolved = true;
}
return retry;
}
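`checkRetry` replaces the old `levelLoadingRetryDelay`/`levelLoadingMaxRetryTimeout` arithmetic with a per-error `retryConfig` and `getRetryDelay`. A sketch of the capped exponential backoff this implies, with the config shape assumed rather than taken from the library:

```ts
// Assumed retry-config shape for illustrating the backoff.
interface RetryConfigLike {
  retryDelayMs: number; // base delay, e.g. 1000
  maxRetryDelayMs: number; // cap, e.g. 8000
  backoff?: 'exponential' | 'linear';
}

function retryDelay(config: RetryConfigLike, retryCount: number): number {
  const factor =
    config.backoff === 'linear' ? retryCount + 1 : Math.pow(2, retryCount);
  return Math.min(factor * config.retryDelayMs, config.maxRetryDelayMs);
}

// retryDelay({ retryDelayMs: 1000, maxRetryDelayMs: 8000 }, n) for n = 0..3
// yields 1000, 2000, 4000, 8000 ms.
```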


@ -13,6 +13,7 @@ import {
findFragWithCC,
} from './fragment-finders';
import {
findPart,
getFragmentWithSN,
getPartWith,
updateFragPTSDTS,
@ -28,15 +29,17 @@ import { LevelDetails } from '../loader/level-details';
import Decrypter from '../crypt/decrypter';
import TimeRanges from '../utils/time-ranges';
import { PlaylistLevelType } from '../types/loader';
import { getRetryDelay } from '../utils/error-helper';
import { NetworkErrorAction } from './error-controller';
import type {
BufferAppendingData,
ErrorData,
FragLoadedData,
PartsLoadedData,
KeyLoadedData,
MediaAttachingData,
MediaAttachedData,
BufferFlushingData,
LevelSwitchingData,
ManifestLoadedData,
} from '../types/events';
import type { FragmentTracker } from './fragment-tracker';
import type { Level } from '../types/level';
@ -45,6 +48,7 @@ import type Hls from '../hls';
import type { HlsConfig } from '../config';
import type { NetworkComponentAPI } from '../types/component-api';
import type { SourceBufferName } from '../types/buffer';
import type { RationalTimestamp } from '../utils/timescale-conversion';
type ResolveFragLoaded = (FragLoadedEndData) => void;
type RejectFragLoaded = (LoadError) => void;
@ -75,6 +79,7 @@ export default class BaseStreamController
protected fragmentTracker: FragmentTracker;
protected transmuxer: TransmuxerInterface | null = null;
protected _state: string = State.STOPPED;
protected playlistType: PlaylistLevelType;
protected media: HTMLMediaElement | null = null;
protected mediaBuffer: Bufferable | null = null;
protected config: HlsConfig;
@ -82,8 +87,8 @@ export default class BaseStreamController
protected lastCurrentTime: number = 0;
protected nextLoadPosition: number = 0;
protected startPosition: number = 0;
protected startTimeOffset: number | null = null;
protected loadedmetadata: boolean = false;
protected fragLoadError: number = 0;
protected retryDate: number = 0;
protected levels: Array<Level> | null = null;
protected fragmentLoader: FragmentLoader;
@ -91,7 +96,7 @@ export default class BaseStreamController
protected levelLastLoaded: number | null = null;
protected startFragRequested: boolean = false;
protected decrypter: Decrypter;
protected initPTS: Array<number> = [];
protected initPTS: RationalTimestamp[] = [];
protected onvseeking: EventListener | null = null;
protected onvended: EventListener | null = null;
@ -103,9 +108,11 @@ export default class BaseStreamController
hls: Hls,
fragmentTracker: FragmentTracker,
keyLoader: KeyLoader,
logPrefix: string
logPrefix: string,
playlistType: PlaylistLevelType
) {
super();
this.playlistType = playlistType;
this.logPrefix = logPrefix;
this.log = logger.log.bind(logger, `${logPrefix}:`);
this.warn = logger.warn.bind(logger, `${logPrefix}:`);
@ -115,7 +122,7 @@ export default class BaseStreamController
this.fragmentTracker = fragmentTracker;
this.config = hls.config;
this.decrypter = new Decrypter(hls.config);
hls.on(Events.LEVEL_SWITCHING, this.onLevelSwitching, this);
hls.on(Events.MANIFEST_LOADED, this.onManifestLoaded, this);
}
protected doTick() {
@ -129,9 +136,9 @@ export default class BaseStreamController
public stopLoad() {
this.fragmentLoader.abort();
this.keyLoader.abort();
this.keyLoader.abort(this.playlistType);
const frag = this.fragCurrent;
if (frag) {
if (frag?.loader) {
frag.abortRequests();
this.fragmentTracker.removeFragment(frag);
}
@ -187,7 +194,7 @@ export default class BaseStreamController
protected onMediaAttached(
event: Events.MEDIA_ATTACHED,
data: MediaAttachingData
data: MediaAttachedData
) {
const media = (this.media = this.mediaBuffer = data.media);
this.onvseeking = this.onMediaSeeking.bind(this) as EventListener;
@ -259,13 +266,22 @@ export default class BaseStreamController
'seeking outside of buffer while fragment load in progress, cancel fragment load'
);
fragCurrent.abortRequests();
this.resetLoadingState();
}
this.resetLoadingState();
this.fragPrevious = null;
}
}
}
if (media) {
// Remove gap fragments
this.fragmentTracker.removeFragmentsInRange(
currentTime,
Infinity,
this.playlistType,
true
);
this.lastCurrentTime = currentTime;
}
@ -283,11 +299,12 @@ export default class BaseStreamController
this.startPosition = this.lastCurrentTime = 0;
}
protected onLevelSwitching(
event: Events.LEVEL_SWITCHING,
data: LevelSwitchingData
protected onManifestLoaded(
event: Events.MANIFEST_LOADED,
data: ManifestLoadedData
): void {
this.fragLoadError = 0;
this.startTimeOffset = data.startTimeOffset;
this.initPTS = [];
}
protected onHandlerDestroying() {
@ -297,7 +314,6 @@ export default class BaseStreamController
protected onHandlerDestroyed() {
this.state = State.STOPPED;
this.hls.off(Events.LEVEL_SWITCHING, this.onLevelSwitching, this);
if (this.fragmentLoader) {
this.fragmentLoader.destroy();
}
@ -321,15 +337,15 @@ export default class BaseStreamController
protected loadFragment(
frag: Fragment,
levelDetails: LevelDetails,
level: Level,
targetBufferTime: number
) {
this._loadFragForPlayback(frag, levelDetails, targetBufferTime);
this._loadFragForPlayback(frag, level, targetBufferTime);
}
private _loadFragForPlayback(
frag: Fragment,
levelDetails: LevelDetails,
level: Level,
targetBufferTime: number
) {
const progressCallback: FragmentLoadProgressCallback = (
@ -348,13 +364,12 @@ export default class BaseStreamController
this._handleFragmentLoadProgress(data);
};
this._doFragLoad(frag, levelDetails, targetBufferTime, progressCallback)
this._doFragLoad(frag, level, targetBufferTime, progressCallback)
.then((data) => {
if (!data) {
// if we're here we probably needed to backtrack or are waiting for more parts
return;
}
this.fragLoadError = 0;
const state = this.state;
if (this.fragContextChanged(frag)) {
if (
@ -384,6 +399,40 @@ export default class BaseStreamController
});
}
protected clearTrackerIfNeeded(frag: Fragment) {
const { fragmentTracker } = this;
const fragState = fragmentTracker.getState(frag);
if (fragState === FragmentState.APPENDING) {
// Lower the buffer size and try again
const playlistType = frag.type as PlaylistLevelType;
const bufferedInfo = this.getFwdBufferInfo(
this.mediaBuffer,
playlistType
);
const minForwardBufferLength = Math.max(
frag.duration,
bufferedInfo ? bufferedInfo.len : this.config.maxBufferLength
);
if (this.reduceMaxBufferLength(minForwardBufferLength)) {
fragmentTracker.removeFragment(frag);
}
} else if (this.mediaBuffer?.buffered.length === 0) {
// Stop gap for bad tracker / buffer flush behavior
fragmentTracker.removeAllFragments();
} else if (fragmentTracker.hasParts(frag.type)) {
// In low latency mode, remove fragments for which only some parts were buffered
fragmentTracker.detectPartialFragments({
frag,
part: null,
stats: frag.stats,
id: frag.type,
});
if (fragmentTracker.getState(frag) === FragmentState.PARTIAL) {
fragmentTracker.removeFragment(frag);
}
}
}
protected flushMainBuffer(
startOffset: number,
endOffset: number,
@ -395,13 +444,11 @@ export default class BaseStreamController
// When alternate audio is playing, the audio-stream-controller is responsible for the audio buffer. Otherwise,
// passing a null type flushes both buffers
const flushScope: BufferFlushingData = { startOffset, endOffset, type };
// Reset load errors on flush
this.fragLoadError = 0;
this.hls.trigger(Events.BUFFER_FLUSHING, flushScope);
}
protected _loadInitSegment(frag: Fragment, details: LevelDetails) {
this._doFragLoad(frag, details)
protected _loadInitSegment(frag: Fragment, level: Level) {
this._doFragLoad(frag, level)
.then((data) => {
if (!data || this.fragContextChanged(frag) || !this.levels) {
throw new Error('init load aborted');
@ -424,13 +471,24 @@ export default class BaseStreamController
decryptData.method === 'AES-128'
) {
const startTime = self.performance.now();
// decrypt the subtitles
// decrypt init segment data
return this.decrypter
.decrypt(
new Uint8Array(payload),
decryptData.key.buffer,
decryptData.iv.buffer
)
.catch((err) => {
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_DECRYPT_ERROR,
fatal: false,
error: err,
reason: err.message,
frag,
});
throw err;
})
.then((decryptedData) => {
const endTime = self.performance.now();
hls.trigger(Events.FRAG_DECRYPTED, {
@ -455,15 +513,9 @@ export default class BaseStreamController
throw new Error('init load aborted, missing levels');
}
const details = levels[frag.level].details as LevelDetails;
console.assert(
details,
'Level details are defined when init segment is loaded'
);
const stats = frag.stats;
this.state = State.IDLE;
this.fragLoadError = 0;
level.fragmentError = 0;
frag.data = new Uint8Array(data.payload);
stats.parsing.start = stats.buffering.start = self.performance.now();
stats.parsing.end = stats.buffering.end = self.performance.now();
@ -504,10 +556,10 @@ export default class BaseStreamController
this.log(
`Buffered ${frag.type} sn: ${frag.sn}${
part ? ' part: ' + part.index : ''
} of ${this.logPrefix === '[stream-controller]' ? 'level' : 'track'} ${
frag.level
} (frag:[${(frag.startPTS || NaN).toFixed(3)}-${(
frag.endPTS || NaN
} of ${
this.playlistType === PlaylistLevelType.MAIN ? 'level' : 'track'
} ${frag.level} (frag:[${(frag.startPTS ?? NaN).toFixed(3)}-${(
frag.endPTS ?? NaN
).toFixed(3)}] > buffer:${
media
? TimeRanges.toString(BufferHelper.getBuffered(media))
@ -561,12 +613,15 @@ export default class BaseStreamController
protected _doFragLoad(
frag: Fragment,
details: LevelDetails,
level: Level,
targetBufferTime: number | null = null,
progressCallback?: FragmentLoadProgressCallback
): Promise<PartsLoadedData | FragLoadedData | null> {
if (!this.levels) {
throw new Error('frag load aborted, missing levels');
const details = level?.details;
if (!this.levels || !details) {
throw new Error(
`frag load aborted, missing level${details ? '' : ' detail'}s`
);
}
let keyLoadingPromise: Promise<KeyLoadedData | void> | null = null;
@ -588,13 +643,17 @@ export default class BaseStreamController
}
});
this.hls.trigger(Events.KEY_LOADING, { frag });
this.throwIfFragContextChanged('KEY_LOADING');
if (this.fragCurrent === null) {
keyLoadingPromise = Promise.reject(
new Error(`frag load aborted, context changed in KEY_LOADING`)
);
}
} else if (!frag.encrypted && details.encryptedFragments.length) {
this.keyLoader.loadClear(frag, details.encryptedFragments);
}
targetBufferTime = Math.max(frag.start, targetBufferTime || 0);
if (this.config.lowLatencyMode && details) {
if (this.config.lowLatencyMode && frag.sn !== 'initSegment') {
const partList = details.partList;
if (partList && progressCallback) {
if (targetBufferTime > frag.end && details.fragmentHint) {
@ -616,14 +675,9 @@ export default class BaseStreamController
);
this.nextLoadPosition = part.start + part.duration;
this.state = State.FRAG_LOADING;
this.hls.trigger(Events.FRAG_LOADING, {
frag,
part: partList[partIndex],
targetBufferTime,
});
this.throwIfFragContextChanged('FRAG_LOADING parts');
let result: Promise<PartsLoadedData | FragLoadedData | null>;
if (keyLoadingPromise) {
return keyLoadingPromise
result = keyLoadingPromise
.then((keyLoadedData) => {
if (
!keyLoadedData ||
@ -633,20 +687,33 @@ export default class BaseStreamController
}
return this.doFragPartsLoad(
frag,
partList,
partIndex,
part,
level,
progressCallback
);
})
.catch((error) => this.handleFragLoadError(error));
} else {
result = this.doFragPartsLoad(
frag,
part,
level,
progressCallback
).catch((error: LoadError) => this.handleFragLoadError(error));
}
return this.doFragPartsLoad(
this.hls.trigger(Events.FRAG_LOADING, {
frag,
partList,
partIndex,
progressCallback
).catch((error: LoadError) => this.handleFragLoadError(error));
part,
targetBufferTime,
});
if (this.fragCurrent === null) {
return Promise.reject(
new Error(
`frag load aborted, context changed in FRAG_LOADING parts`
)
);
}
return result;
} else if (
!frag.url ||
this.loadedEndOfParts(partList, targetBufferTime)
@ -669,13 +736,12 @@ export default class BaseStreamController
this.nextLoadPosition = frag.start + frag.duration;
}
this.state = State.FRAG_LOADING;
this.hls.trigger(Events.FRAG_LOADING, { frag, targetBufferTime });
this.throwIfFragContextChanged('FRAG_LOADING');
// Load key before streaming fragment data
const dataOnProgress = this.config.progressive;
let result: Promise<PartsLoadedData | FragLoadedData | null>;
if (dataOnProgress && keyLoadingPromise) {
return keyLoadingPromise
result = keyLoadingPromise
.then((keyLoadedData) => {
if (!keyLoadedData || this.fragContextChanged(keyLoadedData?.frag)) {
return null;
@ -683,53 +749,55 @@ export default class BaseStreamController
return this.fragmentLoader.load(frag, progressCallback);
})
.catch((error) => this.handleFragLoadError(error));
} else {
// load unencrypted fragment data with progress event,
// or handle fragment result after key and fragment are finished loading
result = Promise.all([
this.fragmentLoader.load(
frag,
dataOnProgress ? progressCallback : undefined
),
keyLoadingPromise,
])
.then(([fragLoadedData]) => {
if (!dataOnProgress && fragLoadedData && progressCallback) {
progressCallback(fragLoadedData);
}
return fragLoadedData;
})
.catch((error) => this.handleFragLoadError(error));
}
// load unencrypted fragment data with progress event,
// or handle fragment result after key and fragment are finished loading
return Promise.all([
this.fragmentLoader.load(
frag,
dataOnProgress ? progressCallback : undefined
),
keyLoadingPromise,
])
.then(([fragLoadedData]) => {
if (!dataOnProgress && fragLoadedData && progressCallback) {
progressCallback(fragLoadedData);
}
return fragLoadedData;
})
.catch((error) => this.handleFragLoadError(error));
}
private throwIfFragContextChanged(context: string): void | never {
// exit if context changed during event loop
this.hls.trigger(Events.FRAG_LOADING, { frag, targetBufferTime });
if (this.fragCurrent === null) {
throw new Error(`frag load aborted, context changed in ${context}`);
return Promise.reject(
new Error(`frag load aborted, context changed in FRAG_LOADING`)
);
}
return result;
}
private doFragPartsLoad(
frag: Fragment,
partList: Part[],
partIndex: number,
fromPart: Part,
level: Level,
progressCallback: FragmentLoadProgressCallback
): Promise<PartsLoadedData | null> {
return new Promise(
(resolve: ResolveFragLoaded, reject: RejectFragLoaded) => {
const partsLoaded: FragLoadedData[] = [];
const loadPartIndex = (index: number) => {
const part = partList[index];
const initialPartList = level.details?.partList;
const loadPart = (part: Part) => {
this.fragmentLoader
.loadPart(frag, part, progressCallback)
.then((partLoadedData: FragLoadedData) => {
partsLoaded[part.index] = partLoadedData;
const loadedPart = partLoadedData.part as Part;
this.hls.trigger(Events.FRAG_LOADED, partLoadedData);
const nextPart = partList[index + 1];
if (nextPart && nextPart.fragment === frag) {
loadPartIndex(index + 1);
const nextPart =
getPartWith(level, frag.sn as number, part.index + 1) ||
findPart(initialPartList, frag.sn as number, part.index + 1);
if (nextPart) {
loadPart(nextPart);
} else {
return resolve({
frag,
@ -740,7 +808,7 @@ export default class BaseStreamController
})
.catch(reject);
};
loadPartIndex(partIndex);
loadPart(fromPart);
}
);
}
@ -758,6 +826,7 @@ export default class BaseStreamController
type: ErrorTypes.OTHER_ERROR,
details: ErrorDetails.INTERNAL_EXCEPTION,
err: error,
error,
fatal: true,
});
}
@ -788,9 +857,9 @@ export default class BaseStreamController
protected getCurrentContext(
chunkMeta: ChunkMetadata
): { frag: Fragment; part: Part | null; level: Level } | null {
const { levels } = this;
const { levels, fragCurrent } = this;
const { level: levelIndex, sn, part: partIndex } = chunkMeta;
if (!levels || !levels[levelIndex]) {
if (!levels?.[levelIndex]) {
this.warn(
`Levels object was unset while buffering fragment ${sn} of level ${levelIndex}. The current chunk will not be buffered.`
);
@ -800,10 +869,13 @@ export default class BaseStreamController
const part = partIndex > -1 ? getPartWith(level, sn, partIndex) : null;
const frag = part
? part.fragment
: getFragmentWithSN(level, sn, this.fragCurrent);
: getFragmentWithSN(level, sn, fragCurrent);
if (!frag) {
return null;
}
if (fragCurrent && fragCurrent !== frag) {
frag.stats = fragCurrent.stats;
}
return { frag, part, level };
}
@ -824,7 +896,7 @@ export default class BaseStreamController
buffer = appendUint8Array(data1, data2);
}
if (!buffer || !buffer.length) {
if (!buffer?.length) {
return;
}
@ -875,16 +947,22 @@ export default class BaseStreamController
bufferable: Bufferable | null,
type: PlaylistLevelType
): BufferInfo | null {
const { config } = this;
const pos = this.getLoadPosition();
if (!Number.isFinite(pos)) {
return null;
}
const bufferInfo = BufferHelper.bufferInfo(
bufferable,
pos,
config.maxBufferHole
);
return this.getFwdBufferInfoAtPos(bufferable, pos, type);
}
protected getFwdBufferInfoAtPos(
bufferable: Bufferable | null,
pos: number,
type: PlaylistLevelType
): BufferInfo | null {
const {
config: { maxBufferHole },
} = this;
const bufferInfo = BufferHelper.bufferInfo(bufferable, pos, maxBufferHole);
// Workaround flaw in getting forward buffer when maxBufferHole is smaller than gap at current pos
if (bufferInfo.len === 0 && bufferInfo.nextStart !== undefined) {
const bufferedFragAtPos = this.fragmentTracker.getBufferedFrag(pos, type);
@ -892,7 +970,7 @@ export default class BaseStreamController
return BufferHelper.bufferInfo(
bufferable,
pos,
Math.max(bufferInfo.nextStart, config.maxBufferHole)
Math.max(bufferInfo.nextStart, maxBufferHole)
);
}
}
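`getFwdBufferInfoAtPos` wraps `BufferHelper.bufferInfo` and retries with a larger hole tolerance when the playhead sits just before a small gap that is still tracked as buffered. An illustrative (non-library) helper showing the kind of forward-buffer summary these calls produce, assuming sorted, non-overlapping ranges:

```ts
interface ForwardBufferInfo {
  len: number; // contiguous seconds buffered ahead of pos
  end: number; // end of the contiguous region containing pos
  nextStart?: number; // start of the next range beyond a hole, if any
}

function forwardBufferInfo(
  ranges: Array<[number, number]>, // sorted [start, end] pairs in seconds
  pos: number,
  maxBufferHole: number
): ForwardBufferInfo {
  let end = pos;
  let nextStart: number | undefined;
  for (const [start, rangeEnd] of ranges) {
    if (start - end <= maxBufferHole && rangeEnd > end) {
      end = rangeEnd; // bridge holes smaller than the tolerance
    } else if (start > end && nextStart === undefined) {
      nextStart = start;
    }
  }
  return { len: Math.max(0, end - pos), end, nextStart };
}
```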
@ -913,7 +991,7 @@ export default class BaseStreamController
return Math.min(maxBufLen, config.maxMaxBufferLength);
}
protected reduceMaxBufferLength(threshold?: number) {
protected reduceMaxBufferLength(threshold: number) {
const config = this.config;
const minLength = threshold || config.maxBufferLength;
if (config.maxMaxBufferLength >= minLength) {
@ -925,6 +1003,20 @@ export default class BaseStreamController
return false;
}
protected getAppendedFrag(
position: number,
playlistType: PlaylistLevelType = PlaylistLevelType.MAIN
): Fragment | null {
const fragOrPart = this.fragmentTracker.getAppendedFrag(
position,
PlaylistLevelType.MAIN
);
if (fragOrPart && 'fragment' in fragOrPart) {
return fragOrPart.fragment;
}
return fragOrPart;
}
protected getNextFragment(
pos: number,
levelDetails: LevelDetails
@ -979,6 +1071,52 @@ export default class BaseStreamController
return this.mapToInitFragWhenRequired(frag);
}
protected isLoopLoading(frag: Fragment, targetBufferTime: number): boolean {
const trackerState = this.fragmentTracker.getState(frag);
return (
(trackerState === FragmentState.OK ||
(trackerState === FragmentState.PARTIAL && !!frag.gap)) &&
this.nextLoadPosition > targetBufferTime
);
}
protected getNextFragmentLoopLoading(
frag: Fragment,
levelDetails: LevelDetails,
bufferInfo: BufferInfo,
playlistType: PlaylistLevelType,
maxBufLen: number
): Fragment | null {
const gapStart = frag.gap;
const nextFragment = this.getNextFragment(
this.nextLoadPosition,
levelDetails
);
if (nextFragment === null) {
return nextFragment;
}
frag = nextFragment;
if (gapStart && frag && !frag.gap && bufferInfo.nextStart) {
// Media buffered after GAP tags should not make the next buffer timerange exceed forward buffer length
const nextbufferInfo = this.getFwdBufferInfoAtPos(
this.mediaBuffer ? this.mediaBuffer : this.media,
bufferInfo.nextStart,
playlistType
);
if (
nextbufferInfo !== null &&
bufferInfo.len + nextbufferInfo.len >= maxBufLen
) {
// Returning here might result in not finding an audio and video candidate to skip to
this.log(
`buffer full after gaps in "${playlistType}" playlist starting at sn: ${frag.sn}`
);
return null;
}
}
return frag;
}
mapToInitFragWhenRequired(frag: Fragment | null): typeof frag {
// If an initSegment is present, it must be buffered first
if (frag?.initSegment && !frag?.initSegment.data && !this.bitrateTest) {
@ -1104,10 +1242,11 @@ export default class BaseStreamController
let { fragments, endSN } = levelDetails;
const { fragmentHint } = levelDetails;
const tolerance = config.maxFragLookUpTolerance;
const partList = levelDetails.partList;
const loadingParts = !!(
config.lowLatencyMode &&
levelDetails.partList &&
partList?.length &&
fragmentHint
);
if (loadingParts && fragmentHint && !this.bitrateTest) {
@ -1136,10 +1275,18 @@ export default class BaseStreamController
const curSNIdx = frag.sn - levelDetails.startSN;
// Move fragPrevious forward to support forcing the next fragment to load
// when the buffer catches up to a previously buffered range.
if (this.fragmentTracker.getState(frag) === FragmentState.OK) {
const fragState = this.fragmentTracker.getState(frag);
if (
fragState === FragmentState.OK ||
(fragState === FragmentState.PARTIAL && frag.gap)
) {
fragPrevious = frag;
}
if (fragPrevious && frag.sn === fragPrevious.sn && !loadingParts) {
if (
fragPrevious &&
frag.sn === fragPrevious.sn &&
(!loadingParts || partList[0].fragment.sn > frag.sn)
) {
// Force the next fragment to load if the previous one was already selected. This can occasionally happen with
// non-uniform fragment durations
const sameLevel = fragPrevious && frag.level === fragPrevious.level;
@ -1149,9 +1296,6 @@ export default class BaseStreamController
frag.sn < endSN &&
this.fragmentTracker.getState(nextFrag) !== FragmentState.OK
) {
this.log(
`SN ${frag.sn} just loaded, load next one: ${nextFrag.sn}`
);
frag = nextFrag;
} else {
frag = null;
@ -1259,9 +1403,13 @@ export default class BaseStreamController
startPosition = -1;
}
if (startPosition === -1 || this.lastCurrentTime === -1) {
// first, check if start time offset has been set in playlist, if yes, use this value
const startTimeOffset = details.startTimeOffset!;
if (Number.isFinite(startTimeOffset)) {
// Use Playlist EXT-X-START:TIME-OFFSET when set
// Prioritize Multivariant Playlist offset so that main, audio, and subtitle stream-controller start times match
const offsetInMultivariantPlaylist = this.startTimeOffset !== null;
const startTimeOffset = offsetInMultivariantPlaylist
? this.startTimeOffset
: details.startTimeOffset;
if (startTimeOffset !== null && Number.isFinite(startTimeOffset)) {
startPosition = sliding + startTimeOffset;
if (startTimeOffset < 0) {
startPosition += details.totalduration;
@ -1271,7 +1419,9 @@ export default class BaseStreamController
sliding + details.totalduration
);
this.log(
`Start time offset ${startTimeOffset} found in playlist, adjust startPosition to ${startPosition}`
`Start time offset ${startTimeOffset} found in ${
offsetInMultivariantPlaylist ? 'multivariant' : 'media'
} playlist, adjust startPosition to ${startPosition}`
);
this.startPosition = startPosition;
} else if (details.live) {
@ -1302,7 +1452,7 @@ export default class BaseStreamController
private handleFragLoadAborted(frag: Fragment, part: Part | undefined) {
if (this.transmuxer && frag.sn !== 'initSegment' && frag.stats.aborted) {
this.warn(
`Fragment ${frag.sn}${part ? ' part' + part.index : ''} of level ${
`Fragment ${frag.sn}${part ? ' part ' + part.index : ''} of level ${
frag.level
} was aborted`
);
@ -1324,69 +1474,115 @@ export default class BaseStreamController
filterType: PlaylistLevelType,
data: ErrorData
) {
if (data.fatal) {
this.stopLoad();
this.state = State.ERROR;
return;
}
const config = this.config;
if (data.chunkMeta) {
// Parsing Error: no retries
if (data.chunkMeta && !data.frag) {
const context = this.getCurrentContext(data.chunkMeta);
if (context) {
data.frag = context.frag;
data.levelRetry = true;
this.fragLoadError = config.fragLoadingMaxRetry;
}
}
const frag = data.frag;
// Handle frag error related to caller's filterType
if (!frag || frag.type !== filterType) {
if (!frag || frag.type !== filterType || !this.levels) {
return;
}
const fragCurrent = this.fragCurrent;
console.assert(
fragCurrent &&
frag.sn === fragCurrent.sn &&
frag.level === fragCurrent.level &&
frag.urlId === fragCurrent.urlId,
'Frag load error must match current frag to retry'
);
if (this.fragContextChanged(frag)) {
this.warn(
`Frag load error must match current frag to retry ${frag.url} > ${this.fragCurrent?.url}`
);
return;
}
const gapTagEncountered = data.details === ErrorDetails.FRAG_GAP;
if (gapTagEncountered) {
this.fragmentTracker.fragBuffered(frag, true);
}
// keep retrying until the limit will be reached
if (this.fragLoadError + 1 <= config.fragLoadingMaxRetry) {
const errorAction = data.errorAction;
const { action, retryCount = 0, retryConfig } = errorAction || {};
if (
errorAction &&
action === NetworkErrorAction.RetryRequest &&
retryConfig
) {
if (!this.loadedmetadata) {
this.startFragRequested = false;
this.nextLoadPosition = this.startPosition;
}
// exponential backoff capped to config.fragLoadingMaxRetryTimeout
const delay = Math.min(
Math.pow(2, this.fragLoadError) * config.fragLoadingRetryDelay,
config.fragLoadingMaxRetryTimeout
);
const delay = getRetryDelay(retryConfig, retryCount);
this.warn(
`Fragment ${frag.sn} of ${filterType} ${frag.level} failed to load, retrying in ${delay}ms`
`Fragment ${frag.sn} of ${filterType} ${frag.level} errored with ${
data.details
}, retrying loading ${retryCount + 1}/${
retryConfig.maxNumRetry
} in ${delay}ms`
);
errorAction.resolved = true;
this.retryDate = self.performance.now() + delay;
this.fragLoadError++;
this.state = State.FRAG_LOADING_WAITING_RETRY;
} else if (data.levelRetry) {
if (filterType === PlaylistLevelType.AUDIO) {
// Reset current fragment since audio track audio is essential and may not have a fail-over track
this.fragCurrent = null;
} else if (retryConfig && errorAction) {
this.resetFragmentErrors(filterType);
if (retryCount < retryConfig.maxNumRetry) {
// Network retry is skipped when level switch is preferred
if (!gapTagEncountered) {
errorAction.resolved = true;
}
} else {
logger.warn(
`${data.details} reached or exceeded max retry (${retryCount})`
);
}
// Fragment errors that result in a level switch or redundant fail-over
// should reset the stream controller state to idle
this.fragLoadError = 0;
this.state = State.IDLE;
} else {
logger.error(
`${data.details} reaches max retry, redispatch as fatal ...`
);
// switch error to fatal
data.fatal = true;
this.hls.stopLoad();
this.state = State.ERROR;
}
// Perform next async tick sooner to speed up error action resolution
this.tickImmediate();
}
protected reduceLengthAndFlushBuffer(data: ErrorData): boolean {
// if in appending state
if (this.state === State.PARSING || this.state === State.PARSED) {
const playlistType = data.parent as PlaylistLevelType;
const bufferedInfo = this.getFwdBufferInfo(
this.mediaBuffer,
playlistType
);
// 0.5 : tolerance needed as some browsers stall playback before reaching buffered end
// reduce max buf len if current position is buffered
const buffered = bufferedInfo && bufferedInfo.len > 0.5;
if (buffered) {
this.reduceMaxBufferLength(bufferedInfo.len);
}
const flushBuffer = !buffered;
if (flushBuffer) {
// current position is not buffered, but browser is still complaining about buffer full error
// this happens on IE/Edge, refer to https://github.com/video-dev/hls.js/pull/708
// in that case flush the whole audio buffer to recover
this.warn(
`Buffer full error while media.currentTime is not buffered, flush ${playlistType} buffer`
);
}
if (data.frag) {
this.fragmentTracker.removeFragment(data.frag);
this.nextLoadPosition = data.frag.start;
}
this.resetLoadingState();
return flushBuffer;
}
return false;
}
protected resetFragmentErrors(filterType: PlaylistLevelType) {
if (filterType === PlaylistLevelType.AUDIO) {
// Reset current fragment since audio track audio is essential and may not have a fail-over track
this.fragCurrent = null;
}
// Fragment errors that result in a level switch or redundant fail-over
// should reset the stream controller state to idle
if (!this.loadedmetadata) {
this.startFragRequested = false;
}
if (this.state !== State.STOPPED) {
this.state = State.IDLE;
}
}
protected afterBufferFlushed(
@ -1434,6 +1630,25 @@ export default class BaseStreamController
}
}
protected resetWhenMissingContext(chunkMeta: ChunkMetadata) {
this.warn(
`The loading context changed while buffering fragment ${chunkMeta.sn} of level ${chunkMeta.level}. This chunk will not be buffered.`
);
this.removeUnbufferedFrags();
this.resetStartWhenNotLoaded(chunkMeta.level);
this.resetLoadingState();
}
protected removeUnbufferedFrags(start: number = 0) {
this.fragmentTracker.removeFragmentsInRange(
start,
Infinity,
this.playlistType,
false,
true
);
}
private updateLevelTiming(
frag: Fragment,
part: Part | null,
@ -1441,7 +1656,10 @@ export default class BaseStreamController
partial: boolean
) {
const details = level.details as LevelDetails;
console.assert(!!details, 'level.details must be defined');
if (!details) {
this.warn('level.details undefined');
return;
}
const parsed = Object.keys(frag.elementaryStreams).reduce(
(result, type) => {
const info = frag.elementaryStreams[type];
@ -1481,11 +1699,26 @@ export default class BaseStreamController
},
false
);
if (!parsed) {
this.warn(
if (parsed) {
level.fragmentError = 0;
} else if (this.transmuxer?.error === null) {
const error = new Error(
`Found no media in fragment ${frag.sn} of level ${level.id} resetting transmuxer to fallback to playlist timing`
);
this.warn(error.message);
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
fatal: false,
error,
frag,
reason: `Found no media in msn ${frag.sn} of level "${level.url}"`,
});
if (!this.hls) {
return;
}
this.resetTransmuxer();
// For this error, fall through. Marking parsed will allow advancing to the next fragment.
}
this.state = State.PARSED;
this.hls.trigger(Events.FRAG_PARSED, { frag, part });
@ -1498,6 +1731,13 @@ export default class BaseStreamController
}
}
protected recoverWorkerError(data: ErrorData) {
if (data.event === 'demuxerWorker') {
this.resetTransmuxer();
this.resetLoadingState();
}
}
set state(nextState) {
const previousState = this._state;
if (previousState !== nextState) {


@ -88,6 +88,7 @@ export default class BufferController implements ComponentAPI {
const { hls } = this;
hls.on(Events.MEDIA_ATTACHING, this.onMediaAttaching, this);
hls.on(Events.MEDIA_DETACHING, this.onMediaDetaching, this);
hls.on(Events.MANIFEST_LOADING, this.onManifestLoading, this);
hls.on(Events.MANIFEST_PARSED, this.onManifestParsed, this);
hls.on(Events.BUFFER_RESET, this.onBufferReset, this);
hls.on(Events.BUFFER_APPENDING, this.onBufferAppending, this);
@ -103,6 +104,7 @@ export default class BufferController implements ComponentAPI {
const { hls } = this;
hls.off(Events.MEDIA_ATTACHING, this.onMediaAttaching, this);
hls.off(Events.MEDIA_DETACHING, this.onMediaDetaching, this);
hls.off(Events.MANIFEST_LOADING, this.onManifestLoading, this);
hls.off(Events.MANIFEST_PARSED, this.onManifestParsed, this);
hls.off(Events.BUFFER_RESET, this.onBufferReset, this);
hls.off(Events.BUFFER_APPENDING, this.onBufferAppending, this);
@ -125,6 +127,11 @@ export default class BufferController implements ComponentAPI {
this.lastMpegAudioChunk = null;
}
private onManifestLoading() {
this.bufferCodecEventsExpected = this._bufferCodecEventsTotal = 0;
this.details = null;
}
protected onManifestParsed(
event: Events.MANIFEST_PARSED,
data: ManifestParsedData
@ -134,11 +141,10 @@ export default class BufferController implements ComponentAPI {
// in case alt audio is not used, only one BUFFER_CODEC event will be fired from main stream controller
// it will contain the expected nb of source buffers, no need to compute it
let codecEvents: number = 2;
if ((data.audio && !data.video) || !data.altAudio) {
if ((data.audio && !data.video) || !data.altAudio || !__USE_ALT_AUDIO__) {
codecEvents = 1;
}
this.bufferCodecEventsExpected = this._bufferCodecEventsTotal = codecEvents;
this.details = null;
logger.log(
`${this.bufferCodecEventsExpected} bufferCodec event(s) expected`
);
@ -414,6 +420,10 @@ export default class BufferController implements ComponentAPI {
type: ErrorTypes.MEDIA_ERROR,
parent: frag.type,
details: ErrorDetails.BUFFER_APPEND_ERROR,
frag,
part,
chunkMeta,
error: err,
err,
fatal: false,
};
@ -433,7 +443,6 @@ export default class BufferController implements ComponentAPI {
`[buffer-controller]: Failed ${hls.config.appendErrorMaxRetry} times to append segment in sourceBuffer`
);
event.fatal = true;
hls.stopLoad();
}
}
hls.trigger(Events.ERROR, event);
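With frag, part, chunkMeta and the thrown error now attached to the event, and fatality deferred until appendErrorMaxRetry attempts have failed, an application can react to the fatal case from its ERROR listener. A hedged, application-side sketch (not part of this diff; it only relies on the public hls.js API):

import Hls from 'hls.js';

function attachAppendErrorRecovery(hls: Hls) {
  hls.on(Hls.Events.ERROR, (_event, data) => {
    if (data.details === Hls.ErrorDetails.BUFFER_APPEND_ERROR && data.fatal) {
      // Reset the MediaSource buffers and resume loading after repeated append failures
      hls.recoverMediaError();
    }
  });
}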
@ -717,18 +726,23 @@ export default class BufferController implements ComponentAPI {
this.pendingTracks = {};
// append any pending segments now !
const buffers = this.getSourceBufferTypes();
if (buffers.length === 0) {
if (buffers.length) {
this.hls.trigger(Events.BUFFER_CREATED, { tracks: this.tracks });
buffers.forEach((type: SourceBufferName) => {
operationQueue.executeNext(type);
});
} else {
const error = new Error(
'could not create source buffer for media codec(s)'
);
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_INCOMPATIBLE_CODECS_ERROR,
fatal: true,
reason: 'could not create source buffer for media codec(s)',
error,
reason: error.message,
});
return;
}
buffers.forEach((type: SourceBufferName) => {
operationQueue.executeNext(type);
});
}
}
@ -737,7 +751,6 @@ export default class BufferController implements ComponentAPI {
if (!mediaSource) {
throw Error('createSourceBuffers called when mediaSource was null');
}
let tracksCreated = 0;
for (const trackName in tracks) {
if (!sourceBuffer[trackName]) {
const track = tracks[trackName as keyof TrackSet];
@ -765,7 +778,6 @@ export default class BufferController implements ComponentAPI {
metadata: track.metadata,
id: track.id,
};
tracksCreated++;
} catch (err) {
logger.error(
`[buffer-controller]: error while trying to add sourceBuffer: ${err.message}`
@ -780,9 +792,6 @@ export default class BufferController implements ComponentAPI {
}
}
}
if (tracksCreated) {
this.hls.trigger(Events.BUFFER_CREATED, { tracks: this.tracks });
}
}
// Keep as arrow functions so that we can directly reference these functions directly as event listeners
@ -833,12 +842,14 @@ export default class BufferController implements ComponentAPI {
}
private _onSBUpdateError(type: SourceBufferName, event: Event) {
logger.error(`[buffer-controller]: ${type} SourceBuffer error`, event);
const error = new Error(`${type} SourceBuffer error`);
logger.error(`[buffer-controller]: ${error}`, event);
// according to http://www.w3.org/TR/media-source/#sourcebuffer-append-error
// SourceBuffer errors are not necessarily fatal; if so, the HTMLMediaElement will fire an error event
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_APPENDING_ERROR,
error,
fatal: false,
});
// updateend is always fired after error, so we'll allow that to shift the current operation off of the queue
@ -876,7 +887,6 @@ export default class BufferController implements ComponentAPI {
logger.log(
`[buffer-controller]: Removing [${removeStart},${removeEnd}] from the ${type} SourceBuffer`
);
console.assert(!sb.updating, `${type} sourceBuffer must not be updating`);
sb.remove(removeStart, removeEnd);
} else {
// Cycle the queue
@ -897,7 +907,6 @@ export default class BufferController implements ComponentAPI {
}
sb.ended = false;
console.assert(!sb.updating, `${type} sourceBuffer must not be updating`);
sb.appendBuffer(data);
}
@ -929,7 +938,7 @@ export default class BufferController implements ComponentAPI {
// Only cycle the queue if the SB is not updating. There's a bug in Chrome which sets the SB updating flag to
// true when changing the MediaSource duration (https://bugs.chromium.org/p/chromium/issues/detail?id=959359&can=2&q=mediasource%20duration)
// While this is a workaround, it's probably useful to have around
if (!sb || !sb.updating) {
if (!sb?.updating) {
operationQueue.shiftAndExecuteNext(type);
}
});


@ -65,7 +65,7 @@ export default class BufferOperationQueue {
operation.onError(e);
// Only shift the current operation off, otherwise the updateend handler will do this for us
if (!sb || !sb.updating) {
if (!sb?.updating) {
queue.shift();
this.executeNext(type);
}


@ -14,16 +14,16 @@ import StreamController from './stream-controller';
import type { ComponentAPI } from '../types/component-api';
import type Hls from '../hls';
type RestrictedLevel = { width: number; height: number; bitrate: number };
class CapLevelController implements ComponentAPI {
public autoLevelCapping: number;
public firstLevel: number;
public media: HTMLVideoElement | null;
public restrictedLevels: Array<number>;
public timer: number | undefined;
private hls: Hls;
private autoLevelCapping: number;
private firstLevel: number;
private media: HTMLVideoElement | null;
private restrictedLevels: RestrictedLevel[];
private timer: number | undefined;
private clientRect: { width: number; height: number } | null;
private streamController?: StreamController;
public clientRect: { width: number; height: number } | null;
constructor(hls: Hls) {
this.hls = hls;
@ -75,13 +75,13 @@ class CapLevelController implements ComponentAPI {
data: FPSDropLevelCappingData
) {
// Don't add a restricted level more than once
if (
CapLevelController.isLevelAllowed(
data.droppedLevel,
this.restrictedLevels
)
) {
this.restrictedLevels.push(data.droppedLevel);
const level = this.hls.levels[data.droppedLevel];
if (this.isLevelAllowed(level)) {
this.restrictedLevels.push({
bitrate: level.bitrate,
height: level.height,
width: level.width,
});
}
}
@ -152,9 +152,7 @@ class CapLevelController implements ComponentAPI {
}
const validLevels = levels.filter(
(level, index) =>
CapLevelController.isLevelAllowed(index, this.restrictedLevels) &&
index <= capLevelIndex
(level, index) => this.isLevelAllowed(level) && index <= capLevelIndex
);
this.clientRect = null;
@ -235,11 +233,15 @@ class CapLevelController implements ComponentAPI {
return pixelRatio;
}
static isLevelAllowed(
level: number,
restrictedLevels: Array<number> = []
): boolean {
return restrictedLevels.indexOf(level) === -1;
private isLevelAllowed(level: Level): boolean {
const restrictedLevels = this.restrictedLevels;
return !restrictedLevels.some((restrictedLevel) => {
return (
level.bitrate === restrictedLevel.bitrate &&
level.width === restrictedLevel.width &&
level.height === restrictedLevel.height
);
});
}
static getMaxLevelByMediaSize(
@ -247,7 +249,7 @@ class CapLevelController implements ComponentAPI {
width: number,
height: number
): number {
if (!levels || !levels.length) {
if (!levels?.length) {
return -1;
}


@ -1,20 +1,18 @@
import {
FragmentLoaderConstructor,
HlsConfig,
PlaylistLoaderConstructor,
} from '../config';
import { Events } from '../events';
import Hls, { Fragment } from '../hls';
import Hls from '../hls';
import {
CMCD,
CMCDHeaders,
CMCDObjectType,
CMCDStreamingFormat,
CMCDStreamingFormatHLS,
CMCDVersion,
} from '../types/cmcd';
import { ComponentAPI } from '../types/component-api';
import { BufferCreatedData, MediaAttachedData } from '../types/events';
import {
import { BufferHelper } from '../utils/buffer-helper';
import { logger } from '../utils/logger';
import type { ComponentAPI } from '../types/component-api';
import type { Fragment } from '../loader/fragment';
import type { BufferCreatedData, MediaAttachedData } from '../types/events';
import type {
FragmentLoaderContext,
Loader,
LoaderCallbacks,
@ -22,8 +20,11 @@ import {
LoaderContext,
PlaylistLoaderContext,
} from '../types/loader';
import { BufferHelper } from '../utils/buffer-helper';
import { logger } from '../utils/logger';
import type {
FragmentLoaderConstructor,
HlsConfig,
PlaylistLoaderConstructor,
} from '../config';
/**
* Controller to deal with Common Media Client Data (CMCD)
@ -70,12 +71,11 @@ export default class CMCDController implements ComponentAPI {
hls.off(Events.MEDIA_ATTACHED, this.onMediaAttached, this);
hls.off(Events.MEDIA_DETACHED, this.onMediaDetached, this);
hls.off(Events.BUFFER_CREATED, this.onBufferCreated, this);
this.onMediaDetached();
}
destroy() {
this.unregisterListeners();
this.onMediaDetached();
// @ts-ignore
this.hls = this.config = this.audioBuffer = this.videoBuffer = null;
@ -132,7 +132,7 @@ export default class CMCDController implements ComponentAPI {
private createData(): CMCD {
return {
v: CMCDVersion,
sf: CMCDStreamingFormat.HLS,
sf: CMCDStreamingFormatHLS,
sid: this.sid,
cid: this.cid,
pr: this.media?.playbackRate,


@ -24,7 +24,7 @@ import { base64Decode } from '../utils/numeric-encoding-utils';
import { DecryptData, LevelKey } from '../loader/level-key';
import Hex from '../utils/hex';
import { bin2str, parsePssh, parseSinf } from '../utils/mp4-tools';
import EventEmitter from 'eventemitter3';
import { EventEmitter } from 'eventemitter3';
import type Hls from '../hls';
import type { ComponentAPI } from '../types/component-api';
import type {
@ -33,10 +33,15 @@ import type {
ErrorData,
ManifestLoadedData,
} from '../types/events';
import type { EMEControllerConfig } from '../config';
import type { EMEControllerConfig, HlsConfig, LoadPolicy } from '../config';
import type { Fragment } from '../loader/fragment';
import type {
Loader,
LoaderCallbacks,
LoaderConfiguration,
LoaderContext,
} from '../types/loader';
const MAX_LICENSE_REQUEST_FAILURES = 3;
const LOGGER_PREFIX = '[eme]';
interface KeySystemAccessPromises {
@ -65,7 +70,11 @@ class EMEController implements ComponentAPI {
public static CDMCleanupPromise: Promise<void> | void;
private readonly hls: Hls;
private readonly config: EMEControllerConfig;
private readonly config: EMEControllerConfig & {
loader: { new (config: HlsConfig): Loader<LoaderContext> };
certLoadPolicy: LoadPolicy;
keyLoadPolicy: LoadPolicy;
};
private media: HTMLMediaElement | null = null;
private keyFormatPromise: Promise<KeySystemFormats> | null = null;
private keySystemAccessPromises: {
@ -96,23 +105,32 @@ class EMEController implements ComponentAPI {
public destroy() {
this.unregisterListeners();
this.onMediaDetached();
// Remove any references that could be held in config options or callbacks
const config = this.config;
config.requestMediaKeySystemAccessFunc = null;
config.licenseXhrSetup = config.licenseResponseCallback = undefined;
config.drmSystems = config.drmSystemOptions = {};
// @ts-ignore
this.hls =
this.onMediaEncrypted =
this.onWaitingForKey =
this.keyIdToKeySessionPromise =
null as any;
// @ts-ignore
this.config = null;
}
private registerListeners() {
this.hls.on(Events.MEDIA_ATTACHED, this.onMediaAttached, this);
this.hls.on(Events.MEDIA_DETACHED, this.onMediaDetached, this);
this.hls.on(Events.MANIFEST_LOADING, this.onManifestLoading, this);
this.hls.on(Events.MANIFEST_LOADED, this.onManifestLoaded, this);
}
private unregisterListeners() {
this.hls.off(Events.MEDIA_ATTACHED, this.onMediaAttached, this);
this.hls.off(Events.MEDIA_DETACHED, this.onMediaDetached, this);
this.hls.off(Events.MANIFEST_LOADING, this.onManifestLoading, this);
this.hls.off(Events.MANIFEST_LOADED, this.onManifestLoaded, this);
}
@ -296,8 +314,6 @@ class EMEController implements ComponentAPI {
keySystem: KeySystems;
mediaKeys: MediaKeys;
}): MediaKeySessionContext {
console.assert(!!mediaKeys, 'mediaKeys is defined');
this.log(
`Creating key-system session "${keySystem}" keyId: ${Hex.hexDump(
decryptdata.keyId! || []
@ -834,36 +850,73 @@ class EMEController implements ComponentAPI {
private fetchServerCertificate(
keySystem: KeySystems
): Promise<BufferSource | void> {
const config = this.config;
const Loader = config.loader;
const certLoader = new Loader(config as HlsConfig) as Loader<LoaderContext>;
const url = this.getServerCertificateUrl(keySystem);
if (!url) {
return Promise.resolve();
}
this.log(`Fetching serverCertificate for "${keySystem}"`);
return new Promise((resolve, reject) => {
const url = this.getServerCertificateUrl(keySystem);
if (!url) {
return resolve();
}
this.log(`Fetching serverCertificate for "${keySystem}"`);
const xhr = new XMLHttpRequest();
xhr.open('GET', url, true);
xhr.responseType = 'arraybuffer';
xhr.onreadystatechange = () => {
if (xhr.readyState === XMLHttpRequest.DONE) {
if (xhr.status === 200) {
resolve(xhr.response);
} else {
reject(
new EMEKeyError(
{
type: ErrorTypes.KEY_SYSTEM_ERROR,
details:
ErrorDetails.KEY_SYSTEM_SERVER_CERTIFICATE_REQUEST_FAILED,
fatal: true,
networkDetails: xhr,
},
`"${keySystem}" certificate request XHR failed (${url}). Status: ${xhr.status} (${xhr.statusText})`
)
);
}
}
const loaderContext: LoaderContext = {
responseType: 'arraybuffer',
url,
};
xhr.send();
const loadPolicy = config.certLoadPolicy.default;
const loaderConfig: LoaderConfiguration = {
loadPolicy,
timeout: loadPolicy.maxLoadTimeMs,
maxRetry: 0,
retryDelay: 0,
maxRetryDelay: 0,
};
const loaderCallbacks: LoaderCallbacks<LoaderContext> = {
onSuccess: (response, stats, context, networkDetails) => {
resolve(response.data as ArrayBuffer);
},
onError: (response, context, networkDetails, stats) => {
reject(
new EMEKeyError(
{
type: ErrorTypes.KEY_SYSTEM_ERROR,
details:
ErrorDetails.KEY_SYSTEM_SERVER_CERTIFICATE_REQUEST_FAILED,
fatal: true,
networkDetails,
response: {
url: loaderContext.url,
data: undefined,
...response,
},
},
`"${keySystem}" certificate request failed (${url}). Status: ${response.code} (${response.text})`
)
);
},
onTimeout: (stats, context, networkDetails) => {
reject(
new EMEKeyError(
{
type: ErrorTypes.KEY_SYSTEM_ERROR,
details:
ErrorDetails.KEY_SYSTEM_SERVER_CERTIFICATE_REQUEST_FAILED,
fatal: true,
networkDetails,
response: {
url: loaderContext.url,
data: undefined,
},
},
`"${keySystem}" certificate request timed out (${url})`
)
);
},
onAbort: (stats, context, networkDetails) => {
reject(new Error('aborted'));
},
};
certLoader.load(loaderContext, loaderConfig, loaderCallbacks);
});
}
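The rewritten fetchServerCertificate above only runs when getServerCertificateUrl returns a URL for the key-system. A hedged configuration sketch of how such a URL is typically supplied (option names should be verified against the hls.js config documentation for the release in use):

import Hls from 'hls.js';

const hls = new Hls({
  drmSystems: {
    'com.apple.fps': {
      licenseUrl: 'https://license.example.com/fairplay',
      serverCertificateUrl: 'https://license.example.com/fairplay/cert',
    },
  },
});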
@ -982,6 +1035,7 @@ class EMEController implements ComponentAPI {
keySessionContext: MediaKeySessionContext,
licenseChallenge: Uint8Array
): Promise<ArrayBuffer> {
const keyLoadPolicy = this.config.keyLoadPolicy.default;
return new Promise((resolve, reject) => {
const url = this.getLicenseServerUrl(keySessionContext.keySystem);
this.log(`Sending license request to URL: ${url}`);
@ -1015,9 +1069,11 @@ class EMEController implements ComponentAPI {
}
resolve(data);
} else {
const retryConfig = keyLoadPolicy.errorRetry;
const maxNumRetry = retryConfig ? retryConfig.maxNumRetry : 0;
this._requestLicenseFailureCount++;
if (
this._requestLicenseFailureCount > MAX_LICENSE_REQUEST_FAILURES ||
this._requestLicenseFailureCount > maxNumRetry ||
(xhr.status >= 400 && xhr.status < 500)
) {
reject(
@ -1027,15 +1083,19 @@ class EMEController implements ComponentAPI {
details: ErrorDetails.KEY_SYSTEM_LICENSE_REQUEST_FAILED,
fatal: true,
networkDetails: xhr,
response: {
url,
data: undefined as any,
code: xhr.status,
text: xhr.statusText,
},
},
`License Request XHR failed (${url}). Status: ${xhr.status} (${xhr.statusText})`
)
);
} else {
const attemptsLeft =
MAX_LICENSE_REQUEST_FAILURES -
this._requestLicenseFailureCount +
1;
maxNumRetry - this._requestLicenseFailureCount + 1;
this.warn(
`Retrying license request, ${attemptsLeft} attempts left`
);
@ -1123,6 +1183,10 @@ class EMEController implements ComponentAPI {
});
}
private onManifestLoading() {
this.keyFormatPromise = null;
}
private onManifestLoaded(
event: Events.MANIFEST_LOADED,
{ sessionKeys }: ManifestLoadedData
@ -1187,9 +1251,13 @@ class EMEController implements ComponentAPI {
class EMEKeyError extends Error {
public readonly data: ErrorData;
constructor(data: ErrorData, message: string) {
constructor(
data: Omit<ErrorData, 'error'> & { error?: Error },
message: string
) {
super(message);
this.data = data;
data.error ||= new Error(message);
this.data = data as ErrorData;
data.err = data.error;
}
}
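A hedged sketch of the load-policy shape consumed above: keyLoadPolicy.default supplies the errorRetry budget that replaces the old MAX_LICENSE_REQUEST_FAILURES constant, and certLoadPolicy.default supplies the certificate request timeout. Only the fields referenced in this diff are certain; the remaining names and values are illustrative.

const keyLoadPolicyExample = {
  default: {
    maxTimeToFirstByteMs: 8000,
    maxLoadTimeMs: 20000,
    timeoutRetry: { maxNumRetry: 1, retryDelayMs: 1000, maxRetryDelayMs: 8000 },
    errorRetry: { maxNumRetry: 2, retryDelayMs: 1000, maxRetryDelayMs: 8000 },
  },
};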


@ -31,7 +31,7 @@ class FPSController implements ComponentAPI {
}
protected unregisterListeners() {
this.hls.off(Events.MEDIA_ATTACHING, this.onMediaAttaching);
this.hls.off(Events.MEDIA_ATTACHING, this.onMediaAttaching, this);
}
destroy() {


@ -2,11 +2,10 @@ import BinarySearch from '../utils/binary-search';
import { Fragment } from '../loader/fragment';
/**
* Returns first fragment whose endPdt value exceeds the given PDT.
* @param {Array<Fragment>} fragments - The array of candidate fragments
* @param {number|null} [PDTValue = null] - The PDT value which must be exceeded
* @param {number} [maxFragLookUpTolerance = 0] - The amount of time that a fragment's start/end can be within in order to be considered contiguous
* @returns {*|null} fragment - The best matching fragment
* Returns first fragment whose endPdt value exceeds the given PDT, or null.
* @param fragments - The array of candidate fragments
* @param PDTValue - The PDT value which must be exceeded
* @param maxFragLookUpTolerance - The amount of time that a fragment's start/end can be within in order to be considered contiguous
*/
export function findFragmentByPDT(
fragments: Array<Fragment>,
@ -48,11 +47,11 @@ export function findFragmentByPDT(
* Finds a fragment based on the SN of the previous fragment; or based on the needs of the current buffer.
* This method compensates for small buffer gaps by applying a tolerance to the start of any candidate fragment, thus
* breaking any traps which would cause the same fragment to be continuously selected within a small range.
* @param {*} fragPrevious - The last frag successfully appended
* @param {Array} fragments - The array of candidate fragments
* @param {number} [bufferEnd = 0] - The end of the contiguous buffered range the playhead is currently within
* @param {number} maxFragLookUpTolerance - The amount of time that a fragment's start/end can be within in order to be considered contiguous
* @returns {*} foundFrag - The best matching fragment
* @param fragPrevious - The last frag successfully appended
* @param fragments - The array of candidate fragments
* @param bufferEnd - The end of the contiguous buffered range the playhead is currently within
* @param maxFragLookUpTolerance - The amount of time that a fragment's start/end can be within in order to be considered contiguous
* @returns a matching fragment or null
*/
export function findFragmentByPTS(
fragPrevious: Fragment | null,
@ -91,10 +90,10 @@ export function findFragmentByPTS(
/**
* The test function used by the findFragmentBySn's BinarySearch to look for the best match to the current buffer conditions.
* @param {*} candidate - The fragment to test
* @param {number} [bufferEnd = 0] - The end of the current buffered range the playhead is currently within
* @param {number} [maxFragLookUpTolerance = 0] - The amount of time that a fragment's start can be within in order to be considered contiguous
* @returns {number} - 0 if it matches, 1 if too low, -1 if too high
* @param candidate - The fragment to test
* @param bufferEnd - The end of the current buffered range the playhead is currently within
* @param maxFragLookUpTolerance - The amount of time that a fragment's start can be within in order to be considered contiguous
* @returns 0 if it matches, 1 if too low, -1 if too high
*/
export function fragmentWithinToleranceTest(
bufferEnd = 0,
@ -145,10 +144,10 @@ export function fragmentWithinToleranceTest(
/**
* The test function used by the findFragmentByPdt's BinarySearch to look for the best match to the current buffer conditions.
* This function tests the candidate's program date time values, as represented in Unix time
* @param {*} candidate - The fragment to test
* @param {number} [pdtBufferEnd = 0] - The Unix time representing the end of the current buffered range
* @param {number} [maxFragLookUpTolerance = 0] - The amount of time that a fragment's start can be within in order to be considered contiguous
* @returns {boolean} True if contiguous, false otherwise
* @param candidate - The fragment to test
* @param pdtBufferEnd - The Unix time representing the end of the current buffered range
* @param maxFragLookUpTolerance - The amount of time that a fragment's start can be within in order to be considered contiguous
* @returns true if contiguous, false otherwise
*/
export function pdtWithinToleranceTest(
pdtBufferEnd: number,


@ -15,7 +15,7 @@ import type {
} from '../types/events';
import type Hls from '../hls';
export enum FragmentState {
export const enum FragmentState {
NOT_LOADED = 'NOT_LOADED',
APPENDING = 'APPENDING',
PARTIAL = 'PARTIAL',
@ -23,8 +23,8 @@ export enum FragmentState {
}
export class FragmentTracker implements ComponentAPI {
private activeFragment: Fragment | null = null;
private activeParts: Part[] | null = null;
private activePartLists: { [key in PlaylistLevelType]?: Part[] } =
Object.create(null);
private endListFragments: { [key in PlaylistLevelType]?: FragmentEntity } =
Object.create(null);
private fragments: Partial<Record<string, FragmentEntity>> =
@ -37,6 +37,7 @@ export class FragmentTracker implements ComponentAPI {
private bufferPadding: number = 0.2;
private hls: Hls;
private hasGaps: boolean = false;
constructor(hls: Hls) {
this.hls = hls;
@ -62,51 +63,37 @@ export class FragmentTracker implements ComponentAPI {
this._unregisterListeners();
// @ts-ignore
this.fragments =
// @ts-ignore
this.activePartLists =
// @ts-ignore
this.endListFragments =
this.timeRanges =
this.activeFragment =
this.activeParts =
null;
}
/**
* Return a Fragment with an appended range that matches the position and levelType.
* If not found any Fragment, return null
* Return a Fragment or Part with an appended range that matches the position and levelType
* Otherwise, return null
*/
public getAppendedFrag(
position: number,
levelType: PlaylistLevelType
): Fragment | Part | null {
if (levelType === PlaylistLevelType.MAIN) {
const { activeFragment, activeParts } = this;
if (!activeFragment) {
return null;
}
if (activeParts) {
for (let i = activeParts.length; i--; ) {
const activePart = activeParts[i];
const appendedPTS = activePart
? activePart.end
: activeFragment.appendedPTS;
if (
activePart.start <= position &&
appendedPTS !== undefined &&
position <= appendedPTS
) {
// 9 is a magic number. remove parts from lookup after a match but keep some short seeks back.
if (i > 9) {
this.activeParts = activeParts.slice(i - 9);
}
return activePart;
}
const activeParts = this.activePartLists[levelType];
if (activeParts) {
for (let i = activeParts.length; i--; ) {
const activePart = activeParts[i];
if (!activePart) {
break;
}
const appendedPTS = activePart.end;
if (
activePart.start <= position &&
appendedPTS !== null &&
position <= appendedPTS
) {
return activePart;
}
} else if (
activeFragment.start <= position &&
activeFragment.appendedPTS !== undefined &&
position <= activeFragment.appendedPTS
) {
return activeFragment;
}
}
return this.getBufferedFrag(position, levelType);
@ -143,17 +130,23 @@ export class FragmentTracker implements ComponentAPI {
public detectEvictedFragments(
elementaryStream: SourceBufferName,
timeRange: TimeRanges,
playlistType?: PlaylistLevelType
playlistType: PlaylistLevelType,
appendedPart?: Part | null
) {
if (this.timeRanges) {
this.timeRanges[elementaryStream] = timeRange;
}
// Check if any flagged fragments have been unloaded
// excluding anything newer than appendedPartSn
const appendedPartSn = (appendedPart?.fragment.sn || -1) as number;
Object.keys(this.fragments).forEach((key) => {
const fragmentEntity = this.fragments[key];
if (!fragmentEntity) {
return;
}
if (appendedPartSn >= (fragmentEntity.body.sn as number)) {
return;
}
if (!fragmentEntity.buffered && !fragmentEntity.loaded) {
if (fragmentEntity.body.type === playlistType) {
this.removeFragment(fragmentEntity.body);
@ -183,7 +176,7 @@ export class FragmentTracker implements ComponentAPI {
* Checks if the fragment passed in is loaded in the buffer properly
* Partially loaded fragments will be registered as a partial fragment
*/
private detectPartialFragments(data: FragBufferedData) {
public detectPartialFragments(data: FragBufferedData) {
const timeRanges = this.timeRanges;
const { frag, part } = data;
if (!timeRanges || frag.sn === 'initSegment') {
@ -195,13 +188,14 @@ export class FragmentTracker implements ComponentAPI {
if (!fragmentEntity) {
return;
}
Object.keys(timeRanges).forEach((elementaryStream) => {
const isFragHint = !frag.relurl;
Object.keys(timeRanges).forEach((elementaryStream: SourceBufferName) => {
const streamInfo = frag.elementaryStreams[elementaryStream];
if (!streamInfo) {
return;
}
const timeRange = timeRanges[elementaryStream];
const partial = part !== null || streamInfo.partial === true;
const timeRange = timeRanges[elementaryStream] as TimeRanges;
const partial = isFragHint || streamInfo.partial === true;
fragmentEntity.range[elementaryStream] = this.getBufferedTimes(
frag,
part,
@ -215,15 +209,41 @@ export class FragmentTracker implements ComponentAPI {
if (fragmentEntity.body.endList) {
this.endListFragments[fragmentEntity.body.type] = fragmentEntity;
}
if (!isPartial(fragmentEntity)) {
// Remove older fragment parts from lookup after frag is tracked as buffered
this.removeParts((frag.sn as number) - 1, frag.type);
}
} else {
// remove fragment if nothing was appended
this.removeFragment(fragmentEntity.body);
}
}
public fragBuffered(frag: Fragment) {
private removeParts(snToKeep: number, levelType: PlaylistLevelType) {
const activeParts = this.activePartLists[levelType];
if (!activeParts) {
return;
}
this.activePartLists[levelType] = activeParts.filter(
(part) => (part.fragment.sn as number) >= snToKeep
);
}
public fragBuffered(frag: Fragment, force?: true) {
const fragKey = getFragmentKey(frag);
const fragmentEntity = this.fragments[fragKey];
let fragmentEntity = this.fragments[fragKey];
if (!fragmentEntity && force) {
fragmentEntity = this.fragments[fragKey] = {
body: frag,
appendedPTS: null,
loaded: null,
buffered: false,
range: Object.create(null),
};
if (frag.gap) {
this.hasGaps = true;
}
}
if (fragmentEntity) {
fragmentEntity.loaded = null;
fragmentEntity.buffered = true;
@ -240,8 +260,8 @@ export class FragmentTracker implements ComponentAPI {
time: [],
partial,
};
const startPTS = part ? part.start : fragment.start;
const endPTS = part ? part.end : fragment.end;
const startPTS = fragment.start;
const endPTS = fragment.end;
const minEndPTS = fragment.minEndPTS || endPTS;
const maxStartPTS = fragment.maxStartPTS || startPTS;
for (let i = 0; i < timeRange.length; i++) {
@ -354,15 +374,18 @@ export class FragmentTracker implements ComponentAPI {
const { frag, part } = data;
// don't track initsegment (for which sn is not a number)
// don't track frags used for bitrateTest, they're irrelevant.
// don't track parts for memory efficiency
if (frag.sn === 'initSegment' || frag.bitrateTest || part) {
if (frag.sn === 'initSegment' || frag.bitrateTest) {
return;
}
// Fragment entity `loaded` FragLoadedData is null when loading parts
const loaded = part ? null : data;
const fragKey = getFragmentKey(frag);
this.fragments[fragKey] = {
body: frag,
loaded: data,
appendedPTS: null,
loaded,
buffered: false,
range: Object.create(null),
};
@ -373,40 +396,27 @@ export class FragmentTracker implements ComponentAPI {
data: BufferAppendedData
) {
const { frag, part, timeRanges } = data;
if (frag.type === PlaylistLevelType.MAIN) {
if (this.activeFragment !== frag) {
this.activeFragment = frag;
frag.appendedPTS = undefined;
}
if (part) {
let activeParts = this.activeParts;
if (!activeParts) {
this.activeParts = activeParts = [];
}
activeParts.push(part);
} else {
this.activeParts = null;
if (frag.sn === 'initSegment') {
return;
}
const playlistType = frag.type;
if (part) {
let activeParts = this.activePartLists[playlistType];
if (!activeParts) {
this.activePartLists[playlistType] = activeParts = [];
}
activeParts.push(part);
}
// Store the latest timeRanges loaded in the buffer
this.timeRanges = timeRanges;
Object.keys(timeRanges).forEach((elementaryStream: SourceBufferName) => {
const timeRange = timeRanges[elementaryStream] as TimeRanges;
this.detectEvictedFragments(elementaryStream, timeRange);
if (!part && frag.type === PlaylistLevelType.MAIN) {
const streamInfo = frag.elementaryStreams[elementaryStream];
if (!streamInfo) {
return;
}
for (let i = 0; i < timeRange.length; i++) {
const rangeEnd = timeRange.end(i);
if (rangeEnd <= streamInfo.endPTS && rangeEnd > streamInfo.startPTS) {
frag.appendedPTS = Math.max(rangeEnd, frag.appendedPTS || 0);
} else {
frag.appendedPTS = streamInfo.endPTS;
}
}
}
this.detectEvictedFragments(
elementaryStream,
timeRange,
playlistType,
part
);
});
}
@ -419,25 +429,35 @@ export class FragmentTracker implements ComponentAPI {
return !!this.fragments[fragKey];
}
public hasParts(type: PlaylistLevelType): boolean {
return !!this.activePartLists[type]?.length;
}
public removeFragmentsInRange(
start: number,
end: number,
playlistType: PlaylistLevelType
playlistType: PlaylistLevelType,
withGapOnly?: boolean,
unbufferedOnly?: boolean
) {
if (withGapOnly && !this.hasGaps) {
return;
}
Object.keys(this.fragments).forEach((key) => {
const fragmentEntity = this.fragments[key];
if (!fragmentEntity) {
return;
}
if (fragmentEntity.buffered) {
const frag = fragmentEntity.body;
if (
frag.type === playlistType &&
frag.start < end &&
frag.end > start
) {
this.removeFragment(frag);
}
const frag = fragmentEntity.body;
if (frag.type !== playlistType || (withGapOnly && !frag.gap)) {
return;
}
if (
frag.start < end &&
frag.end > start &&
(fragmentEntity.buffered || unbufferedOnly)
) {
this.removeFragment(frag);
}
});
}
@ -446,7 +466,13 @@ export class FragmentTracker implements ComponentAPI {
const fragKey = getFragmentKey(fragment);
fragment.stats.loaded = 0;
fragment.clearElementaryStreamInfo();
fragment.appendedPTS = undefined;
const activeParts = this.activePartLists[fragment.type];
if (activeParts) {
const snToRemove = fragment.sn;
this.activePartLists[fragment.type] = activeParts.filter(
(part) => part.fragment.sn !== snToRemove
);
}
delete this.fragments[fragKey];
if (fragment.endList) {
delete this.endListFragments[fragment.type];
@ -456,15 +482,18 @@ export class FragmentTracker implements ComponentAPI {
public removeAllFragments() {
this.fragments = Object.create(null);
this.endListFragments = Object.create(null);
this.activeFragment = null;
this.activeParts = null;
this.activePartLists = Object.create(null);
this.hasGaps = false;
}
}
function isPartial(fragmentEntity: FragmentEntity): boolean {
return (
fragmentEntity.buffered &&
(fragmentEntity.range.video?.partial || fragmentEntity.range.audio?.partial)
(fragmentEntity.body.gap ||
fragmentEntity.range.video?.partial ||
fragmentEntity.range.audio?.partial ||
fragmentEntity.range.audiovideo?.partial)
);
}


@ -1,6 +1,7 @@
import type { BufferInfo } from '../utils/buffer-helper';
import { BufferHelper } from '../utils/buffer-helper';
import { ErrorTypes, ErrorDetails } from '../errors';
import { PlaylistLevelType } from '../types/loader';
import { Events } from '../events';
import { logger } from '../utils/logger';
import type Hls from '../hls';
@ -41,7 +42,7 @@ export default class GapController {
* Checks if the playhead is stuck within a gap, and if so, attempts to free it.
* A gap is an unbuffered range between two buffered ranges (or the start and the first buffered range).
*
* @param {number} lastCurrentTime Previously read playhead position
* @param lastCurrentTime - Previously read playhead position
*/
public poll(lastCurrentTime: number, activeFrag: Fragment | null) {
const { config, media, stalled } = this;
@ -77,6 +78,7 @@ export default class GapController {
// Clear stalled state when beginning or finishing seeking so that we don't report stalls coming out of a seek
if (beginSeek || seeked) {
this.stalled = null;
return;
}
// The playhead should not be moving
@ -131,8 +133,9 @@ export default class GapController {
const maxStartGapJump = isLive
? level!.details!.targetduration * 2
: MAX_START_GAP_JUMP;
if (startJump > 0 && startJump <= maxStartGapJump) {
this._trySkipBufferHole(null);
const partialOrGap = this.fragmentTracker.getPartialFragment(currentTime);
if (startJump > 0 && (startJump <= maxStartGapJump || partialOrGap)) {
this._trySkipBufferHole(partialOrGap);
return;
}
}
@ -183,7 +186,7 @@ export default class GapController {
// This method isn't limited by the size of the gap between buffered ranges
const targetTime = this._trySkipBufferHole(partial);
// we return here in this case, meaning
// the branch below only executes when we don't handle a partial fragment
// the branch below only executes when we haven't seeked to a new position
if (targetTime || !this.media) {
return;
}
@ -194,7 +197,9 @@ export default class GapController {
// needs to cross some sort of threshold covering all source-buffers content
// to start playing properly.
if (
bufferInfo.len > config.maxBufferHole &&
(bufferInfo.len > config.maxBufferHole ||
(bufferInfo.nextStart &&
bufferInfo.nextStart - currentTime < config.maxBufferHole)) &&
stalledDurationMs > config.highBufferWatchdogPeriod * 1000
) {
logger.warn('Trying to nudge playhead over buffer-hole');
@ -216,15 +221,17 @@ export default class GapController {
if (!stallReported && media) {
// Report stalled error once
this.stallReported = true;
logger.warn(
const error = new Error(
`Playback stalling at @${
media.currentTime
} due to low buffer (${JSON.stringify(bufferInfo)})`
);
logger.warn(error.message);
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_STALLED_ERROR,
fatal: false,
error,
buffer: bufferInfo.len,
});
}
@ -240,19 +247,59 @@ export default class GapController {
if (media === null) {
return 0;
}
const currentTime = media.currentTime;
let lastEndTime = 0;
// Check if currentTime is between unbuffered regions of partial fragments
const buffered = BufferHelper.getBuffered(media);
for (let i = 0; i < buffered.length; i++) {
const startTime = buffered.start(i);
if (
currentTime + config.maxBufferHole >= lastEndTime &&
currentTime < startTime
) {
const currentTime = media.currentTime;
const bufferInfo = BufferHelper.bufferInfo(media, currentTime, 0);
const startTime =
currentTime < bufferInfo.start ? bufferInfo.start : bufferInfo.nextStart;
if (startTime) {
const bufferStarved = bufferInfo.len <= config.maxBufferHole;
const waiting =
bufferInfo.len > 0 && bufferInfo.len < 1 && media.readyState < 3;
const gapLength = startTime - currentTime;
if (gapLength > 0 && (bufferStarved || waiting)) {
// Only allow large gaps to be skipped if it is a start gap, or all fragments in skip range are partial
if (gapLength > config.maxBufferHole) {
const { fragmentTracker } = this;
let startGap = false;
if (currentTime === 0) {
const startFrag = fragmentTracker.getAppendedFrag(
0,
PlaylistLevelType.MAIN
);
if (startFrag && startTime < startFrag.end) {
startGap = true;
}
}
if (!startGap) {
const startProvisioned =
partial ||
fragmentTracker.getAppendedFrag(
currentTime,
PlaylistLevelType.MAIN
);
if (startProvisioned) {
let moreToLoad = false;
let pos = startProvisioned.end;
while (pos < startTime) {
const provisioned = fragmentTracker.getPartialFragment(pos);
if (provisioned) {
pos += provisioned.duration;
} else {
moreToLoad = true;
break;
}
}
if (moreToLoad) {
return 0;
}
}
}
}
const targetTime = Math.max(
startTime + SKIP_BUFFER_RANGE_START,
media.currentTime + SKIP_BUFFER_HOLE_STEP_SECONDS
currentTime + SKIP_BUFFER_HOLE_STEP_SECONDS
);
logger.warn(
`skipping hole, adjusting currentTime from ${currentTime} to ${targetTime}`
@ -260,18 +307,21 @@ export default class GapController {
this.moved = true;
this.stalled = null;
media.currentTime = targetTime;
if (partial) {
if (partial && !partial.gap) {
const error = new Error(
`fragment loaded with buffer holes, seeking from ${currentTime} to ${targetTime}`
);
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_SEEK_OVER_HOLE,
fatal: false,
reason: `fragment loaded with buffer holes, seeking from ${currentTime} to ${targetTime}`,
error,
reason: error.message,
frag: partial,
});
}
return targetTime;
}
lastEndTime = buffered.end(i);
}
return 0;
}
@ -291,20 +341,26 @@ export default class GapController {
if (nudgeRetry < config.nudgeMaxRetry) {
const targetTime = currentTime + (nudgeRetry + 1) * config.nudgeOffset;
// playback stalled in buffered area ... let's nudge currentTime to try to overcome this
logger.warn(`Nudging 'currentTime' from ${currentTime} to ${targetTime}`);
const error = new Error(
`Nudging 'currentTime' from ${currentTime} to ${targetTime}`
);
logger.warn(error.message);
media.currentTime = targetTime;
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_NUDGE_ON_STALL,
error,
fatal: false,
});
} else {
logger.error(
const error = new Error(
`Playhead still not moving while enough data buffered @${currentTime} after ${config.nudgeMaxRetry} nudges`
);
logger.error(error.message);
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.BUFFER_STALLED_ERROR,
error,
fatal: true,
});
}
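The stall detection, hole skipping and nudging above are governed by a handful of config fields referenced throughout this file. A hedged tuning sketch (the values shown are illustrative, not the library defaults):

import Hls from 'hls.js';

const hls = new Hls({
  maxBufferHole: 0.5, // largest gap (in seconds) tolerated before skipping or nudging
  highBufferWatchdogPeriod: 2, // seconds stalled with buffer ahead before nudging the playhead
  nudgeOffset: 0.1, // seconds added to currentTime on each nudge attempt
  nudgeMaxRetry: 3, // nudges attempted before BUFFER_STALLED_ERROR is raised as fatal
});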


@ -5,7 +5,11 @@ import {
removeCuesInRange,
} from '../utils/texttrack-utils';
import * as ID3 from '../demux/id3';
import { DateRange, DateRangeAttribute } from '../loader/date-range';
import {
DateRange,
isDateRangeCueAttribute,
isSCTE35Attribute,
} from '../loader/date-range';
import { MetadataSchema } from '../types/demuxer';
import type {
BufferFlushingData,
@ -27,6 +31,8 @@ type Cue = VTTCue | TextTrackCue;
const MIN_CUE_DURATION = 0.25;
function getCueClass() {
if (typeof self === 'undefined') return undefined;
// Attempt to recreate Safari functionality by creating
// WebKitDataCue objects when available and store the decoded
// ID3 data in the value property of the cue
@ -200,7 +206,7 @@ class ID3TrackController implements ComponentAPI {
// Safari doesn't put the timestamp frame in the TextTrack
if (!ID3.isTimeStampFrame(frame)) {
// add a bounds to any unbounded cues
this.updateId3CueEnds(startTime);
this.updateId3CueEnds(startTime, type);
const cue = new Cue(startTime, endTime, '');
cue.value = frame;
@ -214,12 +220,16 @@ class ID3TrackController implements ComponentAPI {
}
}
updateId3CueEnds(startTime: number) {
updateId3CueEnds(startTime: number, type: MetadataSchema) {
const cues = this.id3Track?.cues;
if (cues) {
for (let i = cues.length; i--; ) {
const cue = cues[i] as any;
if (cue.startTime < startTime && cue.endTime === MAX_CUE_ENDTIME) {
if (
cue.type === type &&
cue.startTime < startTime &&
cue.endTime === MAX_CUE_ENDTIME
) {
cue.endTime = startTime;
}
}
@ -337,14 +347,7 @@ class ID3TrackController implements ComponentAPI {
const attributes = Object.keys(dateRange.attr);
for (let j = 0; j < attributes.length; j++) {
const key = attributes[j];
if (
key === DateRangeAttribute.ID ||
key === DateRangeAttribute.CLASS ||
key === DateRangeAttribute.START_DATE ||
key === DateRangeAttribute.DURATION ||
key === DateRangeAttribute.END_DATE ||
key === DateRangeAttribute.END_ON_NEXT
) {
if (!isDateRangeCueAttribute(key)) {
continue;
}
let cue = cues[key] as any;
@ -355,14 +358,12 @@ class ID3TrackController implements ComponentAPI {
} else {
let data = dateRange.attr[key];
cue = new Cue(startTime, endTime, '');
if (
key === DateRangeAttribute.SCTE35_OUT ||
key === DateRangeAttribute.SCTE35_IN
) {
if (isSCTE35Attribute(key)) {
data = hexToArrayBuffer(data);
}
cue.value = { key, data };
cue.type = MetadataSchema.dateRange;
cue.id = id;
this.id3Track.addCue(cue);
cues[key] = cue;
}


@ -138,11 +138,11 @@ export default class LatencyController implements ComponentAPI {
}
private unregisterListeners() {
this.hls.off(Events.MEDIA_ATTACHED, this.onMediaAttached);
this.hls.off(Events.MEDIA_DETACHING, this.onMediaDetaching);
this.hls.off(Events.MANIFEST_LOADING, this.onManifestLoading);
this.hls.off(Events.LEVEL_UPDATED, this.onLevelUpdated);
this.hls.off(Events.ERROR, this.onError);
this.hls.off(Events.MEDIA_ATTACHED, this.onMediaAttached, this);
this.hls.off(Events.MEDIA_DETACHING, this.onMediaDetaching, this);
this.hls.off(Events.MANIFEST_LOADING, this.onManifestLoading, this);
this.hls.off(Events.LEVEL_UPDATED, this.onLevelUpdated, this);
this.hls.off(Events.ERROR, this.onError, this);
}
private onMediaAttached(
@ -184,9 +184,11 @@ export default class LatencyController implements ComponentAPI {
return;
}
this.stallCount++;
logger.warn(
'[playback-rate-controller]: Stall detected, adjusting target latency'
);
if (this.levelDetails?.live) {
logger.warn(
'[playback-rate-controller]: Stall detected, adjusting target latency'
);
}
}
private timeupdate() {


@ -10,40 +10,48 @@ import {
FragLoadedData,
ErrorData,
LevelSwitchingData,
LevelsUpdatedData,
ManifestLoadingData,
} from '../types/events';
import { HdcpLevel, HdcpLevels, Level } from '../types/level';
import { Level } from '../types/level';
import { Events } from '../events';
import { ErrorTypes, ErrorDetails } from '../errors';
import { isCodecSupportedInMp4 } from '../utils/codecs';
import { addGroupId, assignTrackIdsByGroup } from './level-helper';
import BasePlaylistController from './base-playlist-controller';
import { PlaylistContextType, PlaylistLevelType } from '../types/loader';
import type Hls from '../hls';
import type { HlsUrlParameters, LevelParsed } from '../types/level';
import type { MediaPlaylist } from '../types/media-playlist';
import ContentSteeringController from './content-steering-controller';
const chromeOrFirefox: boolean = /chrome|firefox/.test(
navigator.userAgent.toLowerCase()
);
let chromeOrFirefox: boolean;
export default class LevelController extends BasePlaylistController {
private _levels: Level[] = [];
private _firstLevel: number = -1;
private _startLevel?: number;
private currentLevel: Level | null = null;
private currentLevelIndex: number = -1;
private manualLevelIndex: number = -1;
private steering: ContentSteeringController | null;
public onParsedComplete!: Function;
constructor(hls: Hls) {
constructor(
hls: Hls,
contentSteeringController: ContentSteeringController | null
) {
super(hls, '[level-controller]');
this.steering = contentSteeringController;
this._registerListeners();
}
private _registerListeners() {
const { hls } = this;
hls.on(Events.MANIFEST_LOADING, this.onManifestLoading, this);
hls.on(Events.MANIFEST_LOADED, this.onManifestLoaded, this);
hls.on(Events.LEVEL_LOADED, this.onLevelLoaded, this);
hls.on(Events.LEVELS_UPDATED, this.onLevelsUpdated, this);
hls.on(Events.AUDIO_TRACK_SWITCHED, this.onAudioTrackSwitched, this);
hls.on(Events.FRAG_LOADED, this.onFragLoaded, this);
hls.on(Events.ERROR, this.onError, this);
@ -51,8 +59,10 @@ export default class LevelController extends BasePlaylistController {
private _unregisterListeners() {
const { hls } = this;
hls.off(Events.MANIFEST_LOADING, this.onManifestLoading, this);
hls.off(Events.MANIFEST_LOADED, this.onManifestLoaded, this);
hls.off(Events.LEVEL_LOADED, this.onLevelLoaded, this);
hls.off(Events.LEVELS_UPDATED, this.onLevelsUpdated, this);
hls.off(Events.AUDIO_TRACK_SWITCHED, this.onAudioTrackSwitched, this);
hls.off(Events.FRAG_LOADED, this.onFragLoaded, this);
hls.off(Events.ERROR, this.onError, this);
@ -60,8 +70,8 @@ export default class LevelController extends BasePlaylistController {
public destroy() {
this._unregisterListeners();
this.manualLevelIndex = -1;
this._levels.length = 0;
this.steering = null;
this.resetLevels();
super.destroy();
}
@ -71,45 +81,60 @@ export default class LevelController extends BasePlaylistController {
// clean up live level details to force reload them, and reset load errors
levels.forEach((level) => {
level.loadError = 0;
level.fragmentError = 0;
});
super.startLoad();
}
private resetLevels() {
this._startLevel = undefined;
this.manualLevelIndex = -1;
this.currentLevelIndex = -1;
this.currentLevel = null;
this._levels = [];
}
private onManifestLoading(
event: Events.MANIFEST_LOADING,
data: ManifestLoadingData
) {
this.resetLevels();
}
protected onManifestLoaded(
event: Events.MANIFEST_LOADED,
data: ManifestLoadedData
): void {
let levels: Level[] = [];
let audioTracks: MediaPlaylist[] = [];
let subtitleTracks: MediaPlaylist[] = [];
let bitrateStart: number | undefined;
) {
const levels: Level[] = [];
const levelSet: { [key: string]: Level } = {};
let levelFromSet: Level;
let resolutionFound = false;
let videoCodecFound = false;
let audioCodecFound = false;
// regroup redundant levels together
data.levels.forEach((levelParsed: LevelParsed) => {
const attributes = levelParsed.attrs;
resolutionFound =
resolutionFound || !!(levelParsed.width && levelParsed.height);
videoCodecFound = videoCodecFound || !!levelParsed.videoCodec;
audioCodecFound = audioCodecFound || !!levelParsed.audioCodec;
// erase audio codec info if browser does not support mp4a.40.34.
// demuxer will autodetect codec and fallback to mpeg/audio
if (
chromeOrFirefox &&
levelParsed.audioCodec &&
levelParsed.audioCodec.indexOf('mp4a.40.34') !== -1
) {
levelParsed.audioCodec = undefined;
if (levelParsed.audioCodec?.indexOf('mp4a.40.34') !== -1) {
chromeOrFirefox ||= /chrome|firefox/i.test(navigator.userAgent);
if (chromeOrFirefox) {
levelParsed.audioCodec = undefined;
}
}
const levelKey = `${levelParsed.bitrate}-${levelParsed.attrs.RESOLUTION}-${levelParsed.attrs.CODECS}`;
const {
AUDIO,
CODECS,
'FRAME-RATE': FRAMERATE,
'PATHWAY-ID': PATHWAY,
RESOLUTION,
SUBTITLES,
} = attributes;
const contentSteeringPrefix = __USE_CONTENT_STEERING__
? `${PATHWAY || '.'}-`
: '';
const levelKey = `${contentSteeringPrefix}${levelParsed.bitrate}-${RESOLUTION}-${FRAMERATE}-${CODECS}`;
levelFromSet = levelSet[levelKey];
if (!levelFromSet) {
@ -117,19 +142,41 @@ export default class LevelController extends BasePlaylistController {
levelSet[levelKey] = levelFromSet;
levels.push(levelFromSet);
} else {
levelFromSet.url.push(levelParsed.url);
levelFromSet.addFallback(levelParsed);
}
if (attributes) {
if (attributes.AUDIO) {
addGroupId(levelFromSet, 'audio', attributes.AUDIO);
}
if (attributes.SUBTITLES) {
addGroupId(levelFromSet, 'text', attributes.SUBTITLES);
}
}
addGroupId(levelFromSet, 'audio', AUDIO);
addGroupId(levelFromSet, 'text', SUBTITLES);
});
this.filterAndSortMediaOptions(levels, data);
}
private filterAndSortMediaOptions(
unfilteredLevels: Level[],
data: ManifestLoadedData
) {
let audioTracks: MediaPlaylist[] = [];
let subtitleTracks: MediaPlaylist[] = [];
let resolutionFound = false;
let videoCodecFound = false;
let audioCodecFound = false;
// only keep levels with supported audio/video codecs
let levels = unfilteredLevels.filter(
({ audioCodec, videoCodec, width, height, unknownCodecs }) => {
resolutionFound ||= !!(width && height);
videoCodecFound ||= !!videoCodec;
audioCodecFound ||= !!audioCodec;
return (
!unknownCodecs?.length &&
(!audioCodec || isCodecSupportedInMp4(audioCodec, 'audio')) &&
(!videoCodec || isCodecSupportedInMp4(videoCodec, 'video'))
);
}
);
// remove audio-only level if we also have levels with video codecs or RESOLUTION signalled
if ((resolutionFound || videoCodecFound) && audioCodecFound) {
levels = levels.filter(
@ -137,13 +184,25 @@ export default class LevelController extends BasePlaylistController {
);
}
// only keep levels with supported audio/video codecs
levels = levels.filter(({ audioCodec, videoCodec }) => {
return (
(!audioCodec || isCodecSupportedInMp4(audioCodec, 'audio')) &&
(!videoCodec || isCodecSupportedInMp4(videoCodec, 'video'))
);
});
if (levels.length === 0) {
// Dispatch error after MANIFEST_LOADED is done propagating
Promise.resolve().then(() => {
if (this.hls) {
const error = new Error(
'no level with compatible codecs found in manifest'
);
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.MANIFEST_INCOMPATIBLE_CODECS_ERROR,
fatal: true,
url: data.url,
error,
reason: error.message,
});
}
});
return;
}
if (data.audioTracks) {
audioTracks = data.audioTracks.filter(
@ -158,72 +217,82 @@ export default class LevelController extends BasePlaylistController {
subtitleTracks = data.subtitles;
assignTrackIdsByGroup(subtitleTracks);
}
// start bitrate is the first bitrate of the manifest
const unsortedLevels = levels.slice(0);
// sort levels from lowest to highest
levels.sort((a, b) => {
if (a.attrs['HDCP-LEVEL'] !== b.attrs['HDCP-LEVEL']) {
return (a.attrs['HDCP-LEVEL'] || '') > (b.attrs['HDCP-LEVEL'] || '')
? 1
: -1;
}
if (a.bitrate !== b.bitrate) {
return a.bitrate - b.bitrate;
}
if (a.attrs['FRAME-RATE'] !== b.attrs['FRAME-RATE']) {
return (
a.attrs.decimalFloatingPoint('FRAME-RATE') -
b.attrs.decimalFloatingPoint('FRAME-RATE')
);
}
if (a.attrs.SCORE !== b.attrs.SCORE) {
return (
a.attrs.decimalFloatingPoint('SCORE') -
b.attrs.decimalFloatingPoint('SCORE')
);
}
if (resolutionFound && a.height !== b.height) {
return a.height - b.height;
}
return 0;
});
if (levels.length > 0) {
// start bitrate is the first bitrate of the manifest
bitrateStart = levels[0].bitrate;
// sort levels from lowest to highest
levels.sort((a, b) => {
if (a.attrs['HDCP-LEVEL'] !== b.attrs['HDCP-LEVEL']) {
return (a.attrs['HDCP-LEVEL'] || '') > (b.attrs['HDCP-LEVEL'] || '')
? 1
: -1;
}
if (a.bitrate !== b.bitrate) {
return a.bitrate - b.bitrate;
}
if (a.attrs.SCORE !== b.attrs.SCORE) {
return (
a.attrs.decimalFloatingPoint('SCORE') -
b.attrs.decimalFloatingPoint('SCORE')
);
}
if (resolutionFound && a.height !== b.height) {
return a.height - b.height;
}
return 0;
});
this._levels = levels;
// find index of first level in sorted levels
for (let i = 0; i < levels.length; i++) {
if (levels[i].bitrate === bitrateStart) {
this._firstLevel = i;
this.log(
`manifest loaded, ${levels.length} level(s) found, first bitrate: ${bitrateStart}`
);
break;
let firstLevelInPlaylist = unsortedLevels[0];
if (this.steering) {
levels = this.steering.filterParsedLevels(levels);
if (levels.length !== unsortedLevels.length) {
for (let i = 0; i < unsortedLevels.length; i++) {
if (unsortedLevels[i].pathwayId === levels[0].pathwayId) {
firstLevelInPlaylist = unsortedLevels[i];
break;
}
}
}
}
// Audio is only alternate if the manifest includes a URI along with the audio group tag,
// and this is not an audio-only stream where levels contain audio-only
const audioOnly = audioCodecFound && !videoCodecFound;
const edata: ManifestParsedData = {
levels,
audioTracks,
subtitleTracks,
sessionData: data.sessionData,
sessionKeys: data.sessionKeys,
firstLevel: this._firstLevel,
stats: data.stats,
audio: audioCodecFound,
video: videoCodecFound,
altAudio: !audioOnly && audioTracks.some((t) => !!t.url),
};
this.hls.trigger(Events.MANIFEST_PARSED, edata);
this._levels = levels;
// Initiate loading after all controllers have received MANIFEST_PARSED
if (this.hls.config.autoStartLoad || this.hls.forceStartLoad) {
this.hls.startLoad(this.hls.config.startPosition);
// find index of first level in sorted levels
for (let i = 0; i < levels.length; i++) {
if (levels[i] === firstLevelInPlaylist) {
this._firstLevel = i;
this.log(
`manifest loaded, ${levels.length} level(s) found, first bitrate: ${firstLevelInPlaylist.bitrate}`
);
break;
}
} else {
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.MANIFEST_INCOMPATIBLE_CODECS_ERROR,
fatal: true,
url: data.url,
reason: 'no level with compatible codecs found in manifest',
});
}
// Audio is only alternate if the manifest includes a URI along with the audio group tag,
// and this is not an audio-only stream where levels contain audio-only
const audioOnly = audioCodecFound && !videoCodecFound;
const edata: ManifestParsedData = {
levels,
audioTracks,
subtitleTracks,
sessionData: data.sessionData,
sessionKeys: data.sessionKeys,
firstLevel: this._firstLevel,
stats: data.stats,
audio: audioCodecFound,
video: videoCodecFound,
altAudio: !audioOnly && audioTracks.some((t) => !!t.url),
};
this.hls.trigger(Events.MANIFEST_PARSED, edata);
// Initiate loading after all controllers have received MANIFEST_PARSED
if (this.hls.config.autoStartLoad || this.hls.forceStartLoad) {
this.hls.startLoad(this.hls.config.startPosition);
}
}
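// --- Illustrative sketch (not part of hls.js source): how an embedding application drives
// --- the autoStartLoad/startLoad pair referenced above. The playlist URL is a placeholder
// --- assumption; `hls` is a public Hls instance created with `autoStartLoad: false`.
function startPlaybackManually(hls: Hls, media: HTMLMediaElement): void {
  hls.attachMedia(media);
  hls.loadSource('https://example.com/playlist.m3u8');
  hls.on(Events.MANIFEST_PARSED, () => {
    // With autoStartLoad disabled, fragment loading only begins on this explicit call.
    hls.startLoad(-1);
  });
}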
@ -243,19 +312,18 @@ export default class LevelController extends BasePlaylistController {
if (levels.length === 0) {
return;
}
if (this.currentLevelIndex === newLevel && levels[newLevel]?.details) {
return;
}
// check if level idx is valid
if (newLevel < 0 || newLevel >= levels.length) {
// invalid level id given, trigger error
const error = new Error('invalid level idx');
const fatal = newLevel < 0;
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.OTHER_ERROR,
details: ErrorDetails.LEVEL_SWITCH_ERROR,
level: newLevel,
fatal,
reason: 'invalid level idx',
error,
reason: error.message,
});
if (fatal) {
return;
@ -263,22 +331,41 @@ export default class LevelController extends BasePlaylistController {
newLevel = Math.min(newLevel, levels.length - 1);
}
// stopping live reloading timer if any
this.clearTimer();
const lastLevelIndex = this.currentLevelIndex;
const lastLevel = levels[lastLevelIndex];
const lastLevel = this.currentLevel;
const lastPathwayId = lastLevel ? lastLevel.attrs['PATHWAY-ID'] : undefined;
const level = levels[newLevel];
this.log(`switching to level ${newLevel} from ${lastLevelIndex}`);
const pathwayId = level.attrs['PATHWAY-ID'];
this.currentLevelIndex = newLevel;
this.currentLevel = level;
if (
lastLevelIndex === newLevel &&
level.details &&
lastLevel &&
lastPathwayId === pathwayId
) {
return;
}
this.log(
`Switching to level ${newLevel}${
pathwayId ? ' with Pathway ' + pathwayId : ''
} from level ${lastLevelIndex}${
lastPathwayId ? ' with Pathway ' + lastPathwayId : ''
}`
);
const levelSwitchingData: LevelSwitchingData = Object.assign({}, level, {
level: newLevel,
maxBitrate: level.maxBitrate,
attrs: level.attrs,
uri: level.uri,
urlId: level.urlId,
});
// @ts-ignore
delete levelSwitchingData._attrs;
// @ts-ignore
delete levelSwitchingData._urlId;
this.hls.trigger(Events.LEVEL_SWITCHING, levelSwitchingData);
// check if we need to load playlist for this level
@ -333,170 +420,15 @@ export default class LevelController extends BasePlaylistController {
}
protected onError(event: Events.ERROR, data: ErrorData) {
super.onError(event, data);
if (data.fatal) {
if (data.fatal || !data.context) {
return;
}
// Switch to redundant level when track fails to load
const context = data.context;
const level = this._levels[this.currentLevelIndex];
if (
context &&
((context.type === PlaylistContextType.AUDIO_TRACK &&
level.audioGroupIds &&
context.groupId === level.audioGroupIds[level.urlId]) ||
(context.type === PlaylistContextType.SUBTITLE_TRACK &&
level.textGroupIds &&
context.groupId === level.textGroupIds[level.urlId]))
data.context.type === PlaylistContextType.LEVEL &&
data.context.level === this.level
) {
this.redundantFailover(this.currentLevelIndex);
return;
}
let levelError = false;
let levelSwitch = true;
let levelIndex;
// try to recover non-fatal errors
switch (data.details) {
case ErrorDetails.FRAG_LOAD_ERROR:
case ErrorDetails.FRAG_LOAD_TIMEOUT:
case ErrorDetails.KEY_LOAD_ERROR:
case ErrorDetails.KEY_LOAD_TIMEOUT:
if (data.frag) {
// Share fragment error count across media options (main, audio, subs)
// This allows for level based rendition switching when media option assets fail
const variantLevelIndex =
data.frag.type === PlaylistLevelType.MAIN
? data.frag.level
: this.currentLevelIndex;
const level = this._levels[variantLevelIndex];
// Set levelIndex when we're out of fragment retries
if (level) {
level.fragmentError++;
if (level.fragmentError > this.hls.config.fragLoadingMaxRetry) {
levelIndex = variantLevelIndex;
}
} else {
levelIndex = variantLevelIndex;
}
}
break;
case ErrorDetails.KEY_SYSTEM_STATUS_OUTPUT_RESTRICTED: {
const restrictedHdcpLevel = level.attrs['HDCP-LEVEL'];
if (restrictedHdcpLevel) {
this.hls.maxHdcpLevel =
HdcpLevels[
HdcpLevels.indexOf(restrictedHdcpLevel as HdcpLevel) - 1
];
this.warn(
`Restricting playback to HDCP-LEVEL of "${this.hls.maxHdcpLevel}" or lower`
);
}
}
// eslint-disable-next-line no-fallthrough
case ErrorDetails.FRAG_PARSING_ERROR:
case ErrorDetails.KEY_SYSTEM_NO_SESSION:
levelIndex =
data.frag?.type === PlaylistLevelType.MAIN
? data.frag.level
: this.currentLevelIndex;
// Do not retry level. Escalate to fatal if switching levels fails.
data.levelRetry = false;
break;
case ErrorDetails.LEVEL_LOAD_ERROR:
case ErrorDetails.LEVEL_LOAD_TIMEOUT:
// Do not perform level switch if an error occurred using delivery directives
// Attempt to reload level without directives first
if (context) {
if (context.deliveryDirectives) {
levelSwitch = false;
}
levelIndex = context.level;
}
levelError = true;
break;
case ErrorDetails.REMUX_ALLOC_ERROR:
levelIndex = data.level ?? this.currentLevelIndex;
levelError = true;
break;
}
if (levelIndex !== undefined) {
this.recoverLevel(data, levelIndex, levelError, levelSwitch);
}
}
/**
* Switch to a redundant stream if any available.
* If redundant stream is not available, emergency switch down if ABR mode is enabled.
*/
private recoverLevel(
errorEvent: ErrorData,
levelIndex: number,
levelError: boolean,
levelSwitch: boolean
): void {
const { details: errorDetails } = errorEvent;
const level = this._levels[levelIndex];
level.loadError++;
if (levelError) {
const retrying = this.retryLoadingOrFail(errorEvent);
if (retrying) {
// boolean used to inform stream controller not to switch back to IDLE on non-fatal error
errorEvent.levelRetry = true;
} else {
this.currentLevelIndex = -1;
return;
}
}
if (levelSwitch) {
const redundantLevels = level.url.length;
// Try redundant fail-over until level.loadError reaches redundantLevels
if (redundantLevels > 1 && level.loadError < redundantLevels) {
errorEvent.levelRetry = true;
this.redundantFailover(levelIndex);
} else if (this.manualLevelIndex === -1) {
// Search for next level to retry
let nextLevel = -1;
const levels = this._levels;
for (let i = levels.length; i--; ) {
const candidate = (i + this.currentLevelIndex) % levels.length;
if (
candidate !== this.currentLevelIndex &&
levels[candidate].loadError === 0
) {
nextLevel = candidate;
break;
}
}
if (nextLevel > -1 && this.currentLevelIndex !== nextLevel) {
this.warn(`${errorDetails}: switch to ${nextLevel}`);
errorEvent.levelRetry = true;
this.hls.nextAutoLevel = nextLevel;
} else if (errorEvent.levelRetry === false) {
// No levels to switch to and no more retries
errorEvent.fatal = true;
}
}
}
}
private redundantFailover(levelIndex: number) {
const level = this._levels[levelIndex];
const redundantLevels = level.url.length;
if (redundantLevels > 1) {
// Update the url id of all levels so that we stay on the same set of variants when level switching
const newUrlId = (level.urlId + 1) % redundantLevels;
this.warn(`Switching to redundant URL-id ${newUrlId}`);
this._levels.forEach((level) => {
level.urlId = newUrlId;
});
this.level = levelIndex;
this.checkRetry(data);
}
}
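// --- Illustrative sketch (not part of hls.js source): the round-robin URL-id rotation
// --- performed above, shown in isolation. `pickNextUrlId` is a hypothetical helper that
// --- exists only for this example.
function pickNextUrlId(currentUrlId: number, redundantUrls: number): number {
  // Rotate to the next redundant URI, wrapping back to the first after the last one.
  return redundantUrls > 1 ? (currentUrlId + 1) % redundantUrls : currentUrlId;
}
// pickNextUrlId(0, 3) === 1; pickNextUrlId(2, 3) === 0 (wraps around to the primary URI).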
@ -505,7 +437,6 @@ export default class LevelController extends BasePlaylistController {
if (frag !== undefined && frag.type === PlaylistLevelType.MAIN) {
const level = this._levels[frag.level];
if (level !== undefined) {
level.fragmentError = 0;
level.loadError = 0;
}
}
@ -528,7 +459,6 @@ export default class LevelController extends BasePlaylistController {
// reset level load error counter on successful level loaded only if there is no issues with fragments
if (curLevel.fragmentError === 0) {
curLevel.loadError = 0;
this.retryCount = 0;
}
this.playlistLoaded(level, data, curLevel.details);
} else if (data.deliveryDirectives?.skip) {
@ -541,14 +471,17 @@ export default class LevelController extends BasePlaylistController {
event: Events.AUDIO_TRACK_SWITCHED,
data: TrackSwitchedData
) {
const currentLevel = this.hls.levels[this.currentLevelIndex];
const currentLevel = this.currentLevel;
if (!currentLevel) {
return;
}
if (currentLevel.audioGroupIds) {
const audioGroupId = this.hls.audioTracks[data.id].groupId;
if (
currentLevel.audioGroupIds &&
currentLevel.audioGroupId !== audioGroupId
) {
let urlId = -1;
const audioGroupId = this.hls.audioTracks[data.id].groupId;
for (let i = 0; i < currentLevel.audioGroupIds.length; i++) {
if (currentLevel.audioGroupIds[i] === audioGroupId) {
urlId = i;
@ -556,21 +489,23 @@ export default class LevelController extends BasePlaylistController {
}
}
if (urlId !== currentLevel.urlId) {
if (urlId !== -1 && urlId !== currentLevel.urlId) {
currentLevel.urlId = urlId;
this.startLoad();
if (this.canLoad) {
this.startLoad();
}
}
}
}
protected loadPlaylist(hlsUrlParameters?: HlsUrlParameters) {
super.loadPlaylist();
const level = this.currentLevelIndex;
const currentLevel = this._levels[level];
const currentLevelIndex = this.currentLevelIndex;
const currentLevel = this.currentLevel;
if (this.canLoad && currentLevel && currentLevel.url.length > 0) {
if (currentLevel && this.shouldLoadPlaylist(currentLevel)) {
const id = currentLevel.urlId;
let url = currentLevel.url[id];
let url = currentLevel.uri;
if (hlsUrlParameters) {
try {
url = hlsUrlParameters.addDirectives(url);
@ -581,15 +516,18 @@ export default class LevelController extends BasePlaylistController {
}
}
const pathwayId = currentLevel.attrs['PATHWAY-ID'];
this.log(
`Attempt loading level index ${level}${
`Loading level index ${currentLevelIndex}${
hlsUrlParameters?.msn !== undefined
? ' at sn ' +
hlsUrlParameters.msn +
' part ' +
hlsUrlParameters.part
: ''
} with URL-id ${id} ${url}`
} with${pathwayId ? ' Pathway ' + pathwayId : ''} URI ${id + 1}/${
currentLevel.url.length
} ${url}`
);
// console.log('Current audio track group ID:', this.hls.audioTracks[this.hls.audioTrack].groupId);
@ -597,7 +535,7 @@ export default class LevelController extends BasePlaylistController {
this.clearTimer();
this.hls.trigger(Events.LEVEL_LOADING, {
url,
level,
level: currentLevelIndex,
id,
deliveryDirectives: hlsUrlParameters || null,
});
@ -621,40 +559,77 @@ export default class LevelController extends BasePlaylistController {
removeLevel(levelIndex, urlId) {
const filterLevelAndGroupByIdIndex = (url, id) => id !== urlId;
const levels = this._levels
.filter((level, index) => {
if (index !== levelIndex) {
return true;
}
const levels = this._levels.filter((level, index) => {
if (index !== levelIndex) {
return true;
}
if (level.url.length > 1 && urlId !== undefined) {
level.url = level.url.filter(filterLevelAndGroupByIdIndex);
if (level.audioGroupIds) {
level.audioGroupIds = level.audioGroupIds.filter(
filterLevelAndGroupByIdIndex
);
}
if (level.textGroupIds) {
level.textGroupIds = level.textGroupIds.filter(
filterLevelAndGroupByIdIndex
);
}
level.urlId = 0;
return true;
if (level.url.length > 1 && urlId !== undefined) {
level.url = level.url.filter(filterLevelAndGroupByIdIndex);
if (level.audioGroupIds) {
level.audioGroupIds = level.audioGroupIds.filter(
filterLevelAndGroupByIdIndex
);
}
return false;
})
.map((level, index) => {
const { details } = level;
if (details?.fragments) {
details.fragments.forEach((fragment) => {
fragment.level = index;
});
if (level.textGroupIds) {
level.textGroupIds = level.textGroupIds.filter(
filterLevelAndGroupByIdIndex
);
}
return level;
});
this._levels = levels;
level.urlId = 0;
return true;
}
if (this.steering) {
this.steering.removeLevel(level);
}
return false;
});
this.hls.trigger(Events.LEVELS_UPDATED, { levels });
}
private onLevelsUpdated(
event: Events.LEVELS_UPDATED,
{ levels }: LevelsUpdatedData
) {
levels.forEach((level, index) => {
const { details } = level;
if (details?.fragments) {
details.fragments.forEach((fragment) => {
fragment.level = index;
});
}
});
this._levels = levels;
}
}
export function addGroupId(
level: Level,
type: string,
id: string | undefined
): void {
if (!id) {
return;
}
if (type === 'audio') {
if (!level.audioGroupIds) {
level.audioGroupIds = [];
}
level.audioGroupIds[level.url.length - 1] = id;
} else if (type === 'text') {
if (!level.textGroupIds) {
level.textGroupIds = [];
}
level.textGroupIds[level.url.length - 1] = id;
}
}
function assignTrackIdsByGroup(tracks: MediaPlaylist[]): void {
const groups = {};
tracks.forEach((track) => {
const groupId = track.groupId || '';
track.id = groups[groupId] = groups[groupId] || 0;
groups[groupId]++;
});
}
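// --- Illustrative sketch (not part of hls.js source): the id-per-group behaviour of
// --- assignTrackIdsByGroup above, traced with hypothetical group names.
const exampleGroupIds = ['aud1', 'aud1', 'aud2'];
const exampleCounters: Record<string, number> = {};
const exampleIds = exampleGroupIds.map((groupId) => {
  const id = (exampleCounters[groupId] = exampleCounters[groupId] || 0);
  exampleCounters[groupId]++;
  return id;
});
// exampleIds === [0, 1, 0]: each rendition group restarts its track ids at zero.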

View file

@ -1,44 +1,16 @@
/**
* @module LevelHelper
* Providing methods dealing with playlist sliding and drift
* */
* Provides methods dealing with playlist sliding and drift
*/
import { logger } from '../utils/logger';
import { Fragment, Part } from '../loader/fragment';
import { LevelDetails } from '../loader/level-details';
import type { Level } from '../types/level';
import type { MediaPlaylist } from '../types/media-playlist';
import { DateRange } from '../loader/date-range';
type FragmentIntersection = (oldFrag: Fragment, newFrag: Fragment) => void;
type PartIntersection = (oldPart: Part, newPart: Part) => void;
export function addGroupId(level: Level, type: string, id: string): void {
switch (type) {
case 'audio':
if (!level.audioGroupIds) {
level.audioGroupIds = [];
}
level.audioGroupIds.push(id);
break;
case 'text':
if (!level.textGroupIds) {
level.textGroupIds = [];
}
level.textGroupIds.push(id);
break;
}
}
export function assignTrackIdsByGroup(tracks: MediaPlaylist[]): void {
const groups = {};
tracks.forEach((track) => {
const groupId = track.groupId || '';
track.id = groups[groupId] = groups[groupId] || 0;
groups[groupId]++;
});
}
export function updatePTS(
fragments: Fragment[],
fromIdx: number,
@ -64,9 +36,6 @@ function updateFromToPTS(fragFrom: Fragment, fragTo: Fragment) {
duration = fragFrom.start - fragToPTS;
frag = fragTo;
}
// TODO? Drift can go either way, or the playlist could be completely accurate
// console.assert(duration > 0,
// `duration of ${duration} computed for frag ${frag.sn}, level ${frag.level}, there should be some duration drift between playlist and fragment!`);
if (frag.duration !== duration) {
frag.duration = duration;
}
@ -119,10 +88,13 @@ export function updateFragPTSDTS(
endPTS = Math.max(endPTS, fragEndPts);
endDTS = Math.max(endDTS, frag.endDTS);
}
frag.duration = endPTS - startPTS;
const drift = startPTS - frag.start;
frag.start = frag.startPTS = startPTS;
if (frag.start !== 0) {
frag.start = startPTS;
}
frag.duration = endPTS - frag.start;
frag.startPTS = startPTS;
frag.maxStartPTS = maxStartPTS;
frag.startDTS = startDTS;
frag.endPTS = endPTS;
@ -199,7 +171,6 @@ export function mergeDetails(
) {
newFrag.start = newFrag.startPTS = oldFrag.startPTS as number;
newFrag.startDTS = oldFrag.startDTS;
newFrag.appendedPTS = oldFrag.appendedPTS;
newFrag.maxStartPTS = oldFrag.maxStartPTS;
newFrag.endPTS = oldFrag.endPTS;
@ -466,7 +437,7 @@ export function getFragmentWithSN(
sn: number,
fragCurrent: Fragment | null
): Fragment | null {
if (!level || !level.details) {
if (!level?.details) {
return null;
}
const levelDetails = level.details;
@ -490,10 +461,17 @@ export function getPartWith(
sn: number,
partIndex: number
): Part | null {
if (!level || !level.details) {
if (!level?.details) {
return null;
}
const partList = level.details.partList;
return findPart(level.details?.partList, sn, partIndex);
}
export function findPart(
partList: Part[] | null | undefined,
sn: number,
partIndex: number
): Part | null {
if (partList) {
for (let i = partList.length; i--; ) {
const part = partList[i];

View file

@ -3,12 +3,12 @@ import { changeTypeSupported } from '../is-supported';
import { Events } from '../events';
import { BufferHelper, BufferInfo } from '../utils/buffer-helper';
import { FragmentState } from './fragment-tracker';
import { PlaylistLevelType } from '../types/loader';
import { PlaylistContextType, PlaylistLevelType } from '../types/loader';
import { ElementaryStreamTypes, Fragment } from '../loader/fragment';
import TransmuxerInterface from '../demux/transmuxer-interface';
import { ChunkMetadata } from '../types/transmuxer';
import GapController from './gap-controller';
import { ErrorDetails, ErrorTypes } from '../errors';
import { ErrorDetails } from '../errors';
import type { NetworkComponentAPI } from '../types/component-api';
import type Hls from '../hls';
import type { Level } from '../types/level';
@ -62,7 +62,13 @@ export default class StreamController
fragmentTracker: FragmentTracker,
keyLoader: KeyLoader
) {
super(hls, fragmentTracker, keyLoader, '[stream-controller]');
super(
hls,
fragmentTracker,
keyLoader,
'[stream-controller]',
PlaylistLevelType.MAIN
);
this._registerListeners();
}
@ -120,7 +126,6 @@ export default class StreamController
this.stopLoad();
this.setInterval(TICK_INTERVAL);
this.level = -1;
this.fragLoadError = 0;
if (!this.startFragRequested) {
// determine load level
let startLevel = hls.startLevel;
@ -166,9 +171,6 @@ export default class StreamController
protected doTick() {
switch (this.state) {
case State.IDLE:
this.doTickIdle();
break;
case State.WAITING_LEVEL: {
const { levels, level } = this;
const details = levels?.[level]?.details;
@ -178,6 +180,9 @@ export default class StreamController
}
this.state = State.IDLE;
break;
} else if (this.hls.nextLoadLevel !== this.level) {
this.state = State.IDLE;
break;
}
break;
}
@ -187,7 +192,6 @@ export default class StreamController
const retryDate = this.retryDate;
// if current time is greater than retryDate, or if media is seeking, switch to IDLE state to retry loading
if (!retryDate || now >= retryDate || this.media?.seeking) {
this.log('retryDate reached, switch back to IDLE state');
this.resetStartWhenNotLoaded(this.level);
this.state = State.IDLE;
}
@ -196,8 +200,9 @@ export default class StreamController
default:
break;
}
// check buffer
// check/update current fragment
if (this.state === State.IDLE) {
this.doTickIdle();
}
this.onTickEnd();
}
@ -226,7 +231,7 @@ export default class StreamController
return;
}
if (!levels || !levels[level]) {
if (!levels?.[level]) {
return;
}
@ -252,6 +257,9 @@ export default class StreamController
}
// set next load level : this will trigger a playlist load if needed
if (hls.loadLevel !== level && hls.manualLevel === -1) {
this.log(`Adapting to level ${level} from level ${this.level}`);
}
this.level = hls.nextLoadLevel = level;
const levelDetails = levelInfo.details;
@ -306,25 +314,30 @@ export default class StreamController
} else if (this.backtrackFragment && bufferInfo.len) {
this.backtrackFragment = null;
}
// Avoid loop loading by using nextLoadPosition set for backtracking
if (
frag &&
this.fragmentTracker.getState(frag) === FragmentState.OK &&
this.nextLoadPosition > targetBufferTime
) {
// Cleanup the fragment tracker before trying to find the next unbuffered fragment
const type =
this.audioOnly && !this.altAudio
? ElementaryStreamTypes.AUDIO
: ElementaryStreamTypes.VIDEO;
const mediaBuffer =
(type === ElementaryStreamTypes.VIDEO
? this.videoBuffer
: this.mediaBuffer) || this.media;
if (mediaBuffer) {
this.afterBufferFlushed(mediaBuffer, type, PlaylistLevelType.MAIN);
// Avoid loop loading by using nextLoadPosition set for backtracking and skipping consecutive GAP tags
if (frag && this.isLoopLoading(frag, targetBufferTime)) {
const gapStart = frag.gap;
if (!gapStart) {
// Cleanup the fragment tracker before trying to find the next unbuffered fragment
const type =
this.audioOnly && !this.altAudio
? ElementaryStreamTypes.AUDIO
: ElementaryStreamTypes.VIDEO;
const mediaBuffer =
(type === ElementaryStreamTypes.VIDEO
? this.videoBuffer
: this.mediaBuffer) || this.media;
if (mediaBuffer) {
this.afterBufferFlushed(mediaBuffer, type, PlaylistLevelType.MAIN);
}
}
frag = this.getNextFragment(this.nextLoadPosition, levelDetails);
frag = this.getNextFragmentLoopLoading(
frag,
levelDetails,
bufferInfo,
PlaylistLevelType.MAIN,
maxBufLen
);
}
if (!frag) {
return;
@ -333,51 +346,37 @@ export default class StreamController
frag = frag.initSegment;
}
this.loadFragment(frag, levelDetails, targetBufferTime);
this.loadFragment(frag, levelInfo, targetBufferTime);
}
protected loadFragment(
frag: Fragment,
levelDetails: LevelDetails,
level: Level,
targetBufferTime: number
) {
// Check if fragment is not loaded
const fragState = this.fragmentTracker.getState(frag);
this.fragCurrent = frag;
if (fragState === FragmentState.NOT_LOADED) {
if (
fragState === FragmentState.NOT_LOADED ||
fragState === FragmentState.PARTIAL
) {
if (frag.sn === 'initSegment') {
this._loadInitSegment(frag, levelDetails);
this._loadInitSegment(frag, level);
} else if (this.bitrateTest) {
this.log(
`Fragment ${frag.sn} of level ${frag.level} is being downloaded to test bitrate and will not be buffered`
);
this._loadBitrateTestFrag(frag, levelDetails);
this._loadBitrateTestFrag(frag, level);
} else {
this.startFragRequested = true;
super.loadFragment(frag, levelDetails, targetBufferTime);
super.loadFragment(frag, level, targetBufferTime);
}
} else if (fragState === FragmentState.APPENDING) {
// Lower the buffer size and try again
if (this.reduceMaxBufferLength(frag.duration)) {
this.fragmentTracker.removeFragment(frag);
}
} else if (this.media?.buffered.length === 0) {
// Stop gap for bad tracker / buffer flush behavior
this.fragmentTracker.removeAllFragments();
} else {
this.clearTrackerIfNeeded(frag);
}
}
private getAppendedFrag(position): Fragment | null {
const fragOrPart = this.fragmentTracker.getAppendedFrag(
position,
PlaylistLevelType.MAIN
);
if (fragOrPart && 'fragment' in fragOrPart) {
return fragOrPart.fragment;
}
return fragOrPart;
}
private getBufferedFrag(position) {
return this.fragmentTracker.getBufferedFrag(
position,
@ -421,6 +420,14 @@ export default class StreamController
// minus 1s to avoid video freezing, which could happen if we flush the keyframe of the current video ...
this.flushMainBuffer(0, fragPlayingCurrent.start - 1);
}
const levelDetails = this.getLevelDetails();
if (levelDetails?.live) {
const bufferInfo = this.getMainFwdBufferInfo();
// Do not flush in live stream with low buffer
if (!bufferInfo || bufferInfo.len < levelDetails.targetduration * 2) {
return;
}
}
if (!media.paused && levels) {
// add a safety delay of 1s
const nextLevelId = this.hls.nextLoadLevel;
@ -474,6 +481,7 @@ export default class StreamController
this.backtrackFragment = null;
if (fragCurrent) {
fragCurrent.abortRequests();
this.fragmentTracker.removeFragment(fragCurrent);
}
switch (this.state) {
case State.KEY_LOADING:
@ -541,6 +549,17 @@ export default class StreamController
this.log(`Media seeked to ${(currentTime as number).toFixed(3)}`);
}
// If seeked was issued before buffer was appended do not tick immediately
const bufferInfo = this.getMainFwdBufferInfo();
if (bufferInfo === null || bufferInfo.len === 0) {
this.warn(
`Main forward buffer length on "seeked" event ${
bufferInfo ? bufferInfo.len : 'empty'
})`
);
return;
}
// tick to speed up FRAG_CHANGED triggering
this.tick();
}
@ -552,8 +571,8 @@ export default class StreamController
this.fragmentTracker.removeAllFragments();
this.couldBacktrack = false;
this.startPosition = this.lastCurrentTime = 0;
this.fragPlaying = null;
this.backtrackFragment = null;
this.levels = this.fragPlaying = this.backtrackFragment = null;
this.altAudio = this.audioOnly = false;
}
private onManifestParsed(
@ -613,23 +632,29 @@ export default class StreamController
return;
}
this.log(
`Level ${newLevelId} loaded [${newDetails.startSN},${newDetails.endSN}], cc [${newDetails.startCC}, ${newDetails.endCC}] duration:${duration}`
`Level ${newLevelId} loaded [${newDetails.startSN},${newDetails.endSN}]${
newDetails.lastPartSn
? `[part-${newDetails.lastPartSn}-${newDetails.lastPartIndex}]`
: ''
}, cc [${newDetails.startCC}, ${newDetails.endCC}] duration:${duration}`
);
const curLevel = levels[newLevelId];
const fragCurrent = this.fragCurrent;
if (
fragCurrent &&
(this.state === State.FRAG_LOADING ||
this.state === State.FRAG_LOADING_WAITING_RETRY)
) {
if (fragCurrent.level !== data.level && fragCurrent.loader) {
this.state = State.IDLE;
this.backtrackFragment = null;
fragCurrent.abortRequests();
if (
(fragCurrent.level !== data.level ||
fragCurrent.urlId !== curLevel.urlId) &&
fragCurrent.loader
) {
this.abortCurrentFrag();
}
}
const curLevel = levels[newLevelId];
let sliding = 0;
if (newDetails.live || curLevel.details?.live) {
if (!newDetails.fragments[0]) {
@ -683,6 +708,7 @@ export default class StreamController
this.warn(
`Dropping fragment ${frag.sn} of level ${frag.level} after level details were reset`
);
this.fragmentTracker.removeFragment(frag);
return;
}
const videoCodec = currentLevel.videoCodec;
@ -735,7 +761,6 @@ export default class StreamController
// if any URL found on new audio track, it is an alternate audio track
const fromAltAudio = this.altAudio;
const altAudio = !!data.url;
const trackId = data.id;
// if we switch on main audio, ensure that main fragment scheduling is synced with media.buffered
// don't do anything if we switch to alt audio: audio stream controller is handling it.
// we will just have to change buffer scheduling on audioTrackSwitched
@ -750,6 +775,7 @@ export default class StreamController
if (fragCurrent) {
this.log('Switching to main audio track, cancel main fragment load');
fragCurrent.abortRequests();
this.fragmentTracker.removeFragment(fragCurrent);
}
// destroy transmuxer to force init segment generation (following audio switch)
this.resetTransmuxer();
@ -765,12 +791,11 @@ export default class StreamController
hls.trigger(Events.BUFFER_FLUSHING, {
startOffset: 0,
endOffset: Number.POSITIVE_INFINITY,
type: 'audio',
type: null,
});
this.fragmentTracker.removeAllFragments();
}
hls.trigger(Events.AUDIO_TRACK_SWITCHED, {
id: trackId,
});
hls.trigger(Events.AUDIO_TRACK_SWITCHED, data);
}
}
@ -857,61 +882,42 @@ export default class StreamController
}
private onError(event: Events.ERROR, data: ErrorData) {
if (data.type === ErrorTypes.KEY_SYSTEM_ERROR) {
this.onFragmentOrKeyLoadError(PlaylistLevelType.MAIN, data);
if (data.fatal) {
this.state = State.ERROR;
return;
}
switch (data.details) {
case ErrorDetails.FRAG_GAP:
case ErrorDetails.FRAG_PARSING_ERROR:
case ErrorDetails.FRAG_DECRYPT_ERROR:
case ErrorDetails.FRAG_LOAD_ERROR:
case ErrorDetails.FRAG_LOAD_TIMEOUT:
case ErrorDetails.FRAG_PARSING_ERROR:
case ErrorDetails.KEY_LOAD_ERROR:
case ErrorDetails.KEY_LOAD_TIMEOUT:
this.onFragmentOrKeyLoadError(PlaylistLevelType.MAIN, data);
break;
case ErrorDetails.LEVEL_LOAD_ERROR:
case ErrorDetails.LEVEL_LOAD_TIMEOUT:
if (this.state !== State.ERROR) {
if (data.fatal) {
// if fatal error, stop processing
this.warn(`${data.details}`);
this.state = State.ERROR;
} else {
// in case of non fatal error while loading level, if level controller is not retrying to load level, switch back to IDLE
if (!data.levelRetry && this.state === State.WAITING_LEVEL) {
this.state = State.IDLE;
}
}
case ErrorDetails.LEVEL_PARSING_ERROR:
// in case of non fatal error while loading level, if level controller is not retrying to load level, switch back to IDLE
if (
!data.levelRetry &&
this.state === State.WAITING_LEVEL &&
data.context?.type === PlaylistContextType.LEVEL
) {
this.state = State.IDLE;
}
break;
case ErrorDetails.BUFFER_FULL_ERROR:
// if in appending state
if (
data.parent === 'main' &&
(this.state === State.PARSING || this.state === State.PARSED)
) {
let flushBuffer = true;
const bufferedInfo = this.getFwdBufferInfo(
this.media,
PlaylistLevelType.MAIN
);
// 0.5 : tolerance needed as some browsers stall playback before reaching buffered end
// reduce max buf len if current position is buffered
if (bufferedInfo && bufferedInfo.len > 0.5) {
flushBuffer = !this.reduceMaxBufferLength(bufferedInfo.len);
}
if (flushBuffer) {
// current position is not buffered, but browser is still complaining about buffer full error
// this happens on IE/Edge, refer to https://github.com/video-dev/hls.js/pull/708
// in that case flush the whole buffer to recover
this.warn(
'buffer full error also media.currentTime is not buffered, flush main'
);
// flush main buffer
this.immediateLevelSwitch();
}
this.resetLoadingState();
if (!data.parent || data.parent !== 'main') {
return;
}
if (this.reduceLengthAndFlushBuffer(data)) {
this.flushMainBuffer(0, Number.POSITIVE_INFINITY);
}
break;
case ErrorDetails.INTERNAL_EXCEPTION:
this.recoverWorkerError(data);
break;
default:
break;
@ -1025,14 +1031,14 @@ export default class StreamController
return audioCodec;
}
private _loadBitrateTestFrag(frag: Fragment, levelDetails: LevelDetails) {
private _loadBitrateTestFrag(frag: Fragment, level: Level) {
frag.bitrateTest = true;
this._doFragLoad(frag, levelDetails).then((data) => {
this._doFragLoad(frag, level).then((data) => {
const { hls } = this;
if (!data || this.fragContextChanged(frag)) {
return;
}
this.fragLoadError = 0;
level.fragmentError = 0;
this.state = State.IDLE;
this.startFragRequested = false;
this.bitrateTest = false;
@ -1055,10 +1061,7 @@ export default class StreamController
const context = this.getCurrentContext(chunkMeta);
if (!context) {
this.warn(
`The loading context changed while buffering fragment ${chunkMeta.sn} of level ${chunkMeta.level}. This chunk will not be buffered.`
);
this.resetStartWhenNotLoaded(chunkMeta.level);
this.resetWhenMissingContext(chunkMeta);
return;
}
const { frag, part, level } = context;
@ -1070,16 +1073,23 @@ export default class StreamController
// Check if the current fragment has been aborted. We check this by first seeing if we're still playing the current level.
// If we are, subsequently check if the currently loading fragment (fragCurrent) has changed.
if (this.fragContextChanged(frag)) {
this.fragmentTracker.removeFragment(frag);
return;
}
this.state = State.PARSING;
if (initSegment) {
if (initSegment.tracks) {
this._bufferInitSegment(level, initSegment.tracks, frag, chunkMeta);
if (initSegment?.tracks) {
const mapFragment = frag.initSegment || frag;
this._bufferInitSegment(
level,
initSegment.tracks,
mapFragment,
chunkMeta
);
hls.trigger(Events.FRAG_PARSING_INIT_SEGMENT, {
frag,
frag: mapFragment,
id,
tracks: initSegment.tracks,
});
@ -1089,7 +1099,7 @@ export default class StreamController
const initPTS = initSegment.initPTS as number;
const timescale = initSegment.timescale as number;
if (Number.isFinite(initPTS)) {
this.initPTS[frag.cc] = initPTS;
this.initPTS[frag.cc] = { baseTime: initPTS, timescale };
hls.trigger(Events.INIT_PTS_FOUND, { frag, id, initPTS, timescale });
}
}

View file

@ -7,6 +7,8 @@ import { FragmentState } from './fragment-tracker';
import BaseStreamController, { State } from './base-stream-controller';
import { PlaylistLevelType } from '../types/loader';
import { Level } from '../types/level';
import { subtitleOptionsIdentical } from '../utils/media-option-attributes';
import { ErrorDetails, ErrorTypes } from '../errors';
import type { NetworkComponentAPI } from '../types/component-api';
import type Hls from '../hls';
import type { FragmentTracker } from './fragment-tracker';
@ -47,7 +49,13 @@ export class SubtitleStreamController
fragmentTracker: FragmentTracker,
keyLoader: KeyLoader
) {
super(hls, fragmentTracker, keyLoader, '[subtitle-stream-controller]');
super(
hls,
fragmentTracker,
keyLoader,
'[subtitle-stream-controller]',
PlaylistLevelType.SUBTITLE
);
this._registerListeners();
}
@ -105,6 +113,11 @@ export class SubtitleStreamController
this.fragmentTracker.removeAllFragments();
}
onMediaDetaching(): void {
this.tracksBuffered = [];
super.onMediaDetaching();
}
onLevelLoaded(event: Events.LEVEL_LOADED, data: LevelLoadedData) {
this.mainDetails = data.details;
}
@ -199,16 +212,15 @@ export class SubtitleStreamController
// If something goes wrong, proceed to next frag, if we were processing one.
onError(event: Events.ERROR, data: ErrorData) {
const frag = data.frag;
// don't handle error not related to subtitle fragment
if (!frag || frag.type !== PlaylistLevelType.SUBTITLE) {
return;
}
if (this.fragCurrent) {
this.fragCurrent.abortRequests();
if (frag?.type === PlaylistLevelType.SUBTITLE) {
if (this.fragCurrent) {
this.fragCurrent.abortRequests();
}
if (this.state !== State.STOPPED) {
this.state = State.IDLE;
}
}
this.state = State.IDLE;
}
// Got all new subtitle levels.
@ -216,15 +228,24 @@ export class SubtitleStreamController
event: Events.SUBTITLE_TRACKS_UPDATED,
{ subtitleTracks }: SubtitleTracksUpdatedData
) {
if (subtitleOptionsIdentical(this.levels, subtitleTracks)) {
this.levels = subtitleTracks.map(
(mediaPlaylist) => new Level(mediaPlaylist)
);
return;
}
this.tracksBuffered = [];
this.levels = subtitleTracks.map(
(mediaPlaylist) => new Level(mediaPlaylist)
);
this.fragmentTracker.removeAllFragments();
this.fragPrevious = null;
this.levels.forEach((level: Level) => {
this.levels = subtitleTracks.map((mediaPlaylist) => {
const level = new Level(mediaPlaylist);
this.tracksBuffered[level.id] = [];
return level;
});
this.fragmentTracker.removeFragmentsInRange(
0,
Number.POSITIVE_INFINITY,
PlaylistLevelType.SUBTITLE
);
this.fragPrevious = null;
this.mediaBuffer = null;
}
@ -346,6 +367,17 @@ export class SubtitleStreamController
decryptData.key.buffer,
decryptData.iv.buffer
)
.catch((err) => {
hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_DECRYPT_ERROR,
fatal: false,
error: err,
reason: err.message,
frag,
});
throw err;
})
.then((decryptedData) => {
const endTime = performance.now();
hls.trigger(Events.FRAG_DECRYPTED, {
@ -372,16 +404,13 @@ export class SubtitleStreamController
if (this.state === State.IDLE) {
const { currentTrackId, levels } = this;
if (
!levels.length ||
!levels[currentTrackId] ||
!levels[currentTrackId].details
) {
const track = levels[currentTrackId];
if (!levels.length || !track || !track.details) {
return;
}
// Expand range of subs loaded by one target-duration in either direction to make up for misaligned playlists
const trackDetails = levels[currentTrackId].details as LevelDetails;
const trackDetails = track.details as LevelDetails;
const targetDuration = trackDetails.targetduration;
const { config } = this;
const currentTime = this.getLoadPosition();
@ -402,11 +431,6 @@ export class SubtitleStreamController
if (bufferLen > maxBufLen) {
return;
}
console.assert(
trackDetails,
'Subtitle track details are defined on idle subtitle stream controller tick'
);
const fragments = trackDetails.fragments;
const fragLen = fragments.length;
const end = trackDetails.edge;
@ -440,7 +464,7 @@ export class SubtitleStreamController
this.fragmentTracker.getState(foundFrag) === FragmentState.NOT_LOADED
) {
// only load if fragment is not loaded
this.loadFragment(foundFrag, trackDetails, targetBufferTime);
this.loadFragment(foundFrag, track, targetBufferTime);
}
}
}
@ -455,15 +479,15 @@ export class SubtitleStreamController
protected loadFragment(
frag: Fragment,
levelDetails: LevelDetails,
level: Level,
targetBufferTime: number
) {
this.fragCurrent = frag;
if (frag.sn === 'initSegment') {
this._loadInitSegment(frag, levelDetails);
this._loadInitSegment(frag, level);
} else {
this.startFragRequested = true;
super.loadFragment(frag, levelDetails, targetBufferTime);
super.loadFragment(frag, level, targetBufferTime);
}
}

View file

@ -175,7 +175,6 @@ class SubtitleTrackController extends BasePlaylistController {
);
if (id === this.trackId) {
this.retryCount = 0;
this.playlistLoaded(id, data, curDetails);
}
}
@ -199,20 +198,18 @@ class SubtitleTrackController extends BasePlaylistController {
if (!levelInfo?.textGroupIds) {
return;
}
const textGroupId = levelInfo.textGroupIds[levelInfo.urlId];
const lastTrack = this.tracksInGroup
? this.tracksInGroup[this.trackId]
: undefined;
if (this.groupId !== textGroupId) {
const lastTrack = this.tracksInGroup
? this.tracksInGroup[this.trackId]
: undefined;
const subtitleTracks = this.tracks.filter(
(track): boolean => !textGroupId || track.groupId === textGroupId
);
this.tracksInGroup = subtitleTracks;
const initialTrackId =
this.findTrackId(lastTrack?.name) || this.findTrackId();
this.groupId = textGroupId;
this.groupId = textGroupId || null;
const subtitleTracksUpdated: SubtitleTracksUpdatedData = {
subtitleTracks,
@ -225,6 +222,9 @@ class SubtitleTrackController extends BasePlaylistController {
if (initialTrackId !== -1) {
this.setSubtitleTrack(initialTrackId, lastTrack);
}
} else if (this.shouldReloadPlaylist(lastTrack)) {
// Retry playlist loading if no playlist is or has been loaded yet
this.setSubtitleTrack(this.trackId, lastTrack);
}
}
@ -242,7 +242,6 @@ class SubtitleTrackController extends BasePlaylistController {
}
protected onError(event: Events.ERROR, data: ErrorData): void {
super.onError(event, data);
if (data.fatal || !data.context) {
return;
}
@ -252,7 +251,7 @@ class SubtitleTrackController extends BasePlaylistController {
data.context.id === this.trackId &&
data.context.groupId === this.groupId
) {
this.retryLoadingOrFail(data);
this.checkRetry(data);
}
}
@ -277,7 +276,7 @@ class SubtitleTrackController extends BasePlaylistController {
protected loadPlaylist(hlsUrlParameters?: HlsUrlParameters): void {
super.loadPlaylist();
const currentTrack = this.tracksInGroup[this.trackId];
if (this.shouldLoadTrack(currentTrack)) {
if (this.shouldLoadPlaylist(currentTrack)) {
const id = currentTrack.id;
const groupId = currentTrack.groupId as string;
let url = currentTrack.url;
@ -368,7 +367,13 @@ class SubtitleTrackController extends BasePlaylistController {
this.clearTimer();
const track = tracks[newId];
this.log(`Switching to subtitle track ${newId}`);
this.log(
`Switching to subtitle-track ${newId}` +
(track
? ` "${track.name}" lang:${track.lang} group:${track.groupId}`
: '')
);
this.trackId = newId;
if (track) {
const { id, groupId = '', name, type, url } = track;
@ -420,7 +425,10 @@ function filterSubtitleTracks(textTrackList: TextTrackList): TextTrack[] {
for (let i = 0; i < textTrackList.length; i++) {
const track = textTrackList[i];
// Edge adds a track without a label; we don't want to use it
if (track.kind === 'subtitles' && track.label) {
if (
(track.kind === 'subtitles' || track.kind === 'captions') &&
track.label
) {
tracks.push(textTrackList[i]);
}
}

View file

@ -8,6 +8,7 @@ import {
addCueToTrack,
removeCuesInRange,
} from '../utils/texttrack-utils';
import { subtitleOptionsIdentical } from '../utils/media-option-attributes';
import { parseIMSC1, IMSC1_CODEC } from '../utils/imsc1-ttml-parser';
import { appendUint8Array } from '../utils/mp4-tools';
import { PlaylistLevelType } from '../types/loader';
@ -30,6 +31,7 @@ import type { HlsConfig } from '../config';
import type { CuesInterface } from '../utils/cues';
import type { MediaPlaylist } from '../types/media-playlist';
import type { VTTCCs } from '../types/vtt';
import type { RationalTimestamp } from '../utils/timescale-conversion';
type TrackProperties = {
label: string;
@ -54,8 +56,7 @@ export class TimelineController implements ComponentAPI {
private Cues: CuesInterface;
private textTracks: Array<TextTrack> = [];
private tracks: Array<MediaPlaylist> = [];
private initPTS: Array<number> = [];
private timescale: Array<number> = [];
private initPTS: RationalTimestamp[] = [];
private unparsedVttFrags: Array<FragLoadedData | FragDecryptedData> = [];
private captionsTracks: Record<string, TextTrack> = {};
private nonNativeCaptionsTracks: Record<string, NonNativeCaptionsTrack> = {};
@ -187,8 +188,7 @@ export class TimelineController implements ComponentAPI {
) {
const { unparsedVttFrags } = this;
if (id === 'main') {
this.initPTS[frag.cc] = initPTS;
this.timescale[frag.cc] = timescale;
this.initPTS[frag.cc] = { baseTime: initPTS, timescale };
}
// Due to asynchronous processing, initial PTS may arrive later than the first VTT fragments are loaded.
@ -306,7 +306,6 @@ export class TimelineController implements ComponentAPI {
this.textTracks = [];
this.unparsedVttFrags = this.unparsedVttFrags || [];
this.initPTS = [];
this.timescale = [];
if (this.cea608Parser1 && this.cea608Parser2) {
this.cea608Parser1.reset();
this.cea608Parser2.reset();
@ -331,20 +330,23 @@ export class TimelineController implements ComponentAPI {
event: Events.SUBTITLE_TRACKS_UPDATED,
data: SubtitleTracksUpdatedData
) {
this.textTracks = [];
const tracks: Array<MediaPlaylist> = data.subtitleTracks || [];
const hasIMSC1 = tracks.some((track) => track.textCodec === IMSC1_CODEC);
if (this.config.enableWebVTT || (hasIMSC1 && this.config.enableIMSC1)) {
const sameTracks =
this.tracks && tracks && this.tracks.length === tracks.length;
this.tracks = tracks || [];
const listIsIdentical = subtitleOptionsIdentical(this.tracks, tracks);
if (listIsIdentical) {
this.tracks = tracks;
return;
}
this.textTracks = [];
this.tracks = tracks;
if (this.config.renderTextTracksNatively) {
const inUseTracks = this.media ? this.media.textTracks : [];
const inUseTracks = this.media ? this.media.textTracks : null;
this.tracks.forEach((track, index) => {
let textTrack: TextTrack | undefined;
if (index < inUseTracks.length) {
if (inUseTracks && index < inUseTracks.length) {
let inUseTrack: TextTrack | null = null;
for (let i = 0; i < inUseTracks.length; i++) {
@ -378,7 +380,7 @@ export class TimelineController implements ComponentAPI {
this.textTracks.push(textTrack);
}
});
} else if (!sameTracks && this.tracks && this.tracks.length) {
} else if (this.tracks.length) {
// Create a list of tracks for the provider to consume
const tracksList = this.tracks.map((track) => {
return {
@ -398,7 +400,7 @@ export class TimelineController implements ComponentAPI {
private _captionsOrSubtitlesFromCharacteristics(
track: MediaPlaylist
): TextTrackKind {
if (track.attrs?.CHARACTERISTICS) {
if (track.attrs.CHARACTERISTICS) {
const transcribesSpokenDialog = /transcribes-spoken-dialog/gi.test(
track.attrs.CHARACTERISTICS
);
@ -480,7 +482,7 @@ export class TimelineController implements ComponentAPI {
// If fragment is subtitle type, parse as WebVTT.
if (payload.byteLength) {
// We need an initial synchronisation PTS. Store fragments as long as none has arrived.
if (!Number.isFinite(initPTS[frag.cc])) {
if (!initPTS[frag.cc]) {
unparsedVttFrags.push(data);
if (initPTS.length) {
// finish unsuccessfully, otherwise the subtitle-stream-controller could be blocked from loading new frags.
@ -533,7 +535,6 @@ export class TimelineController implements ComponentAPI {
parseIMSC1(
payload,
this.initPTS[frag.cc],
this.timescale[frag.cc],
(cues) => {
this._appendCues(cues, frag.level);
hls.trigger(Events.SUBTITLE_FRAG_PROCESSED, {
@ -561,7 +562,6 @@ export class TimelineController implements ComponentAPI {
parseWebVTT(
payloadWebVTT,
this.initPTS[frag.cc],
this.timescale[frag.cc],
vttCCs,
frag.cc,
frag.start,
@ -592,7 +592,6 @@ export class TimelineController implements ComponentAPI {
parseIMSC1(
payload,
this.initPTS[frag.cc],
this.timescale[frag.cc],
() => {
trackPlaylistMedia.textCodec = IMSC1_CODEC;
this._parseIMSC1(frag, payload);
@ -632,7 +631,7 @@ export class TimelineController implements ComponentAPI {
) {
const { frag } = data;
if (frag.type === PlaylistLevelType.SUBTITLE) {
if (!Number.isFinite(this.initPTS[frag.cc])) {
if (!this.initPTS[frag.cc]) {
this.unparsedVttFrags.push(data as unknown as FragLoadedData);
return;
}
@ -733,9 +732,12 @@ export class TimelineController implements ComponentAPI {
}
}
function canReuseVttTextTrack(inUseTrack, manifestTrack): boolean {
function canReuseVttTextTrack(
inUseTrack: (TextTrack & { textTrack1?; textTrack2? }) | null,
manifestTrack: MediaPlaylist
): boolean {
return (
inUseTrack &&
!!inUseTrack &&
inUseTrack.label === manifestTrack.name &&
!(inUseTrack.textTrack1 || inUseTrack.textTrack2)
);

View file

@ -5,3 +5,10 @@ declare const __USE_ALT_AUDIO__: boolean;
declare const __USE_EME_DRM__: boolean;
declare const __USE_SUBTITLES__: boolean;
declare const __USE_CMCD__: boolean;
declare const __USE_CONTENT_STEERING__: boolean;
declare const __USE_VARIABLE_SUBSTITUTION__: boolean;
// __IN_WORKER__ is provided from a closure call around the final UMD bundle.
declare const __IN_WORKER__: boolean;
// __HLS_WORKER_BUNDLE__ is the name of the closure around the final UMD bundle.
declare const __HLS_WORKER_BUNDLE__: Function;

View file

@ -13,6 +13,7 @@ import {
import { dummyTrack } from './dummy-demuxed-track';
import { appendUint8Array } from '../utils/mp4-tools';
import { sliceUint8 } from '../utils/typed-array';
import { RationalTimestamp } from '../utils/timescale-conversion';
class BaseAudioDemuxer implements Demuxer {
protected _audioTrack!: DemuxedAudioTrack;
@ -20,7 +21,7 @@ class BaseAudioDemuxer implements Demuxer {
protected frameIndex: number = 0;
protected cachedData: Uint8Array | null = null;
protected basePTS: number | null = null;
protected initPTS: number | null = null;
protected initPTS: RationalTimestamp | null = null;
protected lastPTS: number | null = null;
resetInitSegment(
@ -40,7 +41,7 @@ class BaseAudioDemuxer implements Demuxer {
};
}
resetTimeStamp(deaultTimestamp) {
resetTimeStamp(deaultTimestamp: RationalTimestamp | null) {
this.initPTS = deaultTimestamp;
this.resetContiguity();
}
@ -181,11 +182,14 @@ class BaseAudioDemuxer implements Demuxer {
export const initPTSFn = (
timestamp: number | undefined,
timeOffset: number,
initPTS: number | null
initPTS: RationalTimestamp | null
): number => {
if (Number.isFinite(timestamp as number)) {
return timestamp! * 90;
}
return timeOffset * 90000 + (initPTS || 0);
const init90kHz = initPTS
? (initPTS.baseTime * 90000) / initPTS.timescale
: 0;
return timeOffset * 90000 + init90kHz;
};
export default BaseAudioDemuxer;
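// --- Illustrative sketch (not part of hls.js source): how the 90 kHz conversion in
// --- initPTSFn above plays out. The sample values are hypothetical.
const exampleInitPTS = { baseTime: 1800, timescale: 1000 }; // 1.8 s expressed in a 1 kHz timescale
const exampleInit90kHz = (exampleInitPTS.baseTime * 90000) / exampleInitPTS.timescale; // 162000
const examplePTS = 4 * 90000 + exampleInit90kHz; // a 4 s time offset becomes 522000 ticks at 90 kHz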

View file

@ -173,8 +173,7 @@ class ExpGolomb {
* Read a sequence parameter set and return some interesting video
* properties. A sequence parameter set is the H264 metadata that
* describes the properties of upcoming video frames.
* @param data {Uint8Array} the bytes of a sequence parameter set
* @return {object} an object with configuration parsed from the
* @returns an object with configuration parsed from the
* sequence parameter set, including the dimensions of the
* associated video frames.
*/

23
node_modules/hls.js/src/demux/id3.ts generated vendored
View file

@ -6,9 +6,8 @@ export type Frame = DecodedFrame<ArrayBuffer | string>;
/**
* Returns true if an ID3 header can be found at offset in data
* @param {Uint8Array} data - The data to search in
* @param {number} offset - The offset at which to start searching
* @return {boolean} - True if an ID3 header is found
* @param data - The data to search
* @param offset - The offset at which to start searching
*/
export const isHeader = (data: Uint8Array, offset: number): boolean => {
/*
@ -51,9 +50,8 @@ export const isHeader = (data: Uint8Array, offset: number): boolean => {
/**
* Returns true if an ID3 footer can be found at offset in data
* @param {Uint8Array} data - The data to search in
* @param {number} offset - The offset at which to start searching
* @return {boolean} - True if an ID3 footer is found
* @param data - The data to search
* @param offset - The offset at which to start searching
*/
export const isFooter = (data: Uint8Array, offset: number): boolean => {
/*
@ -86,9 +84,9 @@ export const isFooter = (data: Uint8Array, offset: number): boolean => {
/**
* Returns any adjacent ID3 tags found in data starting at offset, as one block of data
* @param {Uint8Array} data - The data to search in
* @param {number} offset - The offset at which to start searching
* @return {Uint8Array | undefined} - The block of data containing any ID3 tags found
* @param data - The data to search in
* @param offset - The offset at which to start searching
* @returns the block of data containing any ID3 tags found
* or *undefined* if no header is found at the starting offset
*/
export const getID3Data = (
@ -138,8 +136,7 @@ export const canParse = (data: Uint8Array, offset: number): boolean => {
/**
* Searches for the Elementary Stream timestamp found in the ID3 data chunk
* @param {Uint8Array} data - Block of data containing one or more ID3 tags
* @return {number | undefined} - The timestamp
* @param data - Block of data containing one or more ID3 tags
*/
export const getTimeStamp = (data: Uint8Array): number | undefined => {
const frames: Frame[] = getID3Frames(data);
@ -157,7 +154,6 @@ export const getTimeStamp = (data: Uint8Array): number | undefined => {
/**
* Returns true if the ID3 frame is an Elementary Stream timestamp frame
* @param {ID3 frame} frame
*/
export const isTimeStampFrame = (frame: Frame): boolean => {
return (
@ -184,8 +180,7 @@ const getFrameData = (data: Uint8Array): RawFrame => {
/**
* Returns an array of ID3 frames found in all the ID3 tags in the id3Data
* @param {Uint8Array} id3Data - The ID3 data containing one or more ID3 tags
* @return {ID3.Frame[]} - Array of ID3 frame objects
* @param id3Data - The ID3 data containing one or more ID3 tags
*/
export const getID3Frames = (id3Data: Uint8Array): Frame[] => {
let offset = 0;

View file

@ -63,7 +63,7 @@ class MP4Demuxer implements Demuxer {
this.id3Track = dummyTrack('id3', 1) as DemuxedMetadataTrack;
this.timeOffset = 0;
if (!initSegment || !initSegment.byteLength) {
if (!initSegment?.byteLength) {
return;
}
const initData = parseInitSegment(initSegment);
@ -87,7 +87,9 @@ class MP4Demuxer implements Demuxer {
videoTrack.duration = audioTrack.duration = trackDuration;
}
public resetContiguity(): void {}
public resetContiguity(): void {
this.remainderData = null;
}
static probe(data: Uint8Array) {
// ensure we find a moof box in the first 16 kB

View file

@ -1,4 +1,9 @@
import work from './webworkify-webpack';
import {
WorkerContext,
hasUMDWorker,
injectWorker,
loadWorker,
} from './inject-worker';
import { Events } from '../events';
import Transmuxer, {
TransmuxConfig,
@ -15,17 +20,19 @@ import type Hls from '../hls';
import type { HlsEventEmitter } from '../events';
import type { PlaylistLevelType } from '../types/loader';
import type { TypeSupported } from './tsdemuxer';
import type { RationalTimestamp } from '../utils/timescale-conversion';
const MediaSource = getMediaSource() || { isTypeSupported: () => false };
export default class TransmuxerInterface {
public error: Error | null = null;
private hls: Hls;
private id: PlaylistLevelType;
private observer: HlsEventEmitter;
private frag: Fragment | null = null;
private part: Part | null = null;
private useWorker: boolean;
private worker: any;
private workerContext: WorkerContext | null = null;
private onwmsg?: Function;
private transmuxer: Transmuxer | null = null;
private onTransmuxComplete: (transmuxResult: TransmuxerResult) => void;
@ -48,6 +55,9 @@ export default class TransmuxerInterface {
data = data || {};
data.frag = this.frag;
data.id = this.id;
if (ev === Events.ERROR) {
this.error = data.error;
}
this.hls.trigger(ev, data);
};
@ -65,69 +75,85 @@ export default class TransmuxerInterface {
// refer to https://developer.mozilla.org/en-US/docs/Web/API/WorkerGlobalScope/navigator
const vendor = navigator.vendor;
if (this.useWorker && typeof Worker !== 'undefined') {
logger.log('demuxing in webworker');
let worker;
try {
worker = this.worker = work(
require.resolve('../demux/transmuxer-worker.ts')
);
this.onwmsg = this.onWorkerMessage.bind(this);
worker.addEventListener('message', this.onwmsg);
worker.onerror = (event) => {
this.useWorker = false;
logger.warn('Exception in webworker, fallback to inline');
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.OTHER_ERROR,
details: ErrorDetails.INTERNAL_EXCEPTION,
fatal: false,
event: 'demuxerWorker',
error: new Error(
const canCreateWorker = config.workerPath || hasUMDWorker();
if (canCreateWorker) {
try {
if (config.workerPath) {
logger.log(`loading Web Worker ${config.workerPath} for "${id}"`);
this.workerContext = loadWorker(config.workerPath);
} else {
logger.log(`injecting Web Worker for "${id}"`);
this.workerContext = injectWorker();
}
this.onwmsg = (ev: any) => this.onWorkerMessage(ev);
const { worker } = this.workerContext;
worker.addEventListener('message', this.onwmsg as any);
worker.onerror = (event) => {
const error = new Error(
`${event.message} (${event.filename}:${event.lineno})`
),
);
config.enableWorker = false;
logger.warn(`Error in "${id}" Web Worker, fallback to inline`);
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.OTHER_ERROR,
details: ErrorDetails.INTERNAL_EXCEPTION,
fatal: false,
event: 'demuxerWorker',
error,
});
};
worker.postMessage({
cmd: 'init',
typeSupported: typeSupported,
vendor: vendor,
id: id,
config: JSON.stringify(config),
});
};
worker.postMessage({
cmd: 'init',
typeSupported: typeSupported,
vendor: vendor,
id: id,
config: JSON.stringify(config),
});
} catch (err) {
logger.warn('Error in worker:', err);
logger.error(
'Error while initializing DemuxerWorker, fallback to inline'
);
if (worker) {
// revoke the Object URL that was used to create transmuxer worker, so as not to leak it
self.URL.revokeObjectURL(worker.objectURL);
} catch (err) {
logger.warn(
`Error setting up "${id}" Web Worker, fallback to inline`,
err
);
this.resetWorker();
this.error = null;
this.transmuxer = new Transmuxer(
this.observer,
typeSupported,
config,
vendor,
id
);
}
this.transmuxer = new Transmuxer(
this.observer,
typeSupported,
config,
vendor,
id
);
this.worker = null;
return;
}
} else {
this.transmuxer = new Transmuxer(
this.observer,
typeSupported,
config,
vendor,
id
);
}
this.transmuxer = new Transmuxer(
this.observer,
typeSupported,
config,
vendor,
id
);
}
resetWorker(): void {
if (this.workerContext) {
const { worker, objectURL } = this.workerContext;
if (objectURL) {
// revoke the Object URL that was used to create transmuxer worker, so as not to leak it
self.URL.revokeObjectURL(objectURL);
}
worker.removeEventListener('message', this.onwmsg as any);
worker.onerror = null;
worker.terminate();
this.workerContext = null;
}
}
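// --- Illustrative sketch (not part of hls.js source): the objectURL released above is the
// --- kind produced when a Worker is built from an inlined script blob. The worker script
// --- string below is a placeholder, not the real hls.js worker bundle.
function createInlineWorkerExample(): { worker: Worker; objectURL: string } {
  const blob = new Blob(['self.onmessage = (e) => self.postMessage(e.data);'], {
    type: 'text/javascript',
  });
  const objectURL = self.URL.createObjectURL(blob);
  return { worker: new Worker(objectURL), objectURL };
}
// Tear-down mirrors resetWorker() above: terminate the worker, then call
// self.URL.revokeObjectURL(objectURL) so the blob URL does not leak.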
destroy(): void {
const w = this.worker;
if (w) {
w.removeEventListener('message', this.onwmsg);
w.terminate();
this.worker = null;
if (this.workerContext) {
this.resetWorker();
this.onwmsg = undefined;
} else {
const transmuxer = this.transmuxer;
@ -157,10 +183,10 @@ export default class TransmuxerInterface {
duration: number,
accurateTimeOffset: boolean,
chunkMeta: ChunkMetadata,
defaultInitPTS?: number
defaultInitPTS?: RationalTimestamp
): void {
chunkMeta.transmuxing.start = self.performance.now();
const { transmuxer, worker } = this;
const { transmuxer } = this;
const timeOffset = part ? part.start : frag.start;
// TODO: push "clear-lead" decrypt data for unencrypted fragments in streams with encrypted ones
const decryptdata = frag.decryptdata;
@ -219,9 +245,9 @@ export default class TransmuxerInterface {
this.part = part;
// Frags with sn of 'initSegment' are not transmuxed
if (worker) {
if (this.workerContext) {
// post fragment payload as transferable objects for ArrayBuffer (no copy)
worker.postMessage(
this.workerContext.worker.postMessage(
{
cmd: 'demux',
data,
@ -260,10 +286,10 @@ export default class TransmuxerInterface {
flush(chunkMeta: ChunkMetadata) {
chunkMeta.transmuxing.start = self.performance.now();
const { transmuxer, worker } = this;
if (worker) {
const { transmuxer } = this;
if (this.workerContext) {
worker.postMessage({
this.workerContext.worker.postMessage({
cmd: 'flush',
chunkMeta,
});
@ -302,6 +328,7 @@ export default class TransmuxerInterface {
if (!this.hls) {
return;
}
this.error = error;
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
@ -328,8 +355,11 @@ export default class TransmuxerInterface {
const hls = this.hls;
switch (data.event) {
case 'init': {
// revoke the Object URL that was used to create transmuxer worker, so as not to leak it
self.URL.revokeObjectURL(this.worker.objectURL);
const objectURL = this.workerContext?.objectURL;
if (objectURL) {
// revoke the Object URL that was used to create transmuxer worker, so as not to leak it
self.URL.revokeObjectURL(objectURL);
}
break;
}
@ -361,9 +391,9 @@ export default class TransmuxerInterface {
}
private configureTransmuxer(config: TransmuxConfig) {
const { worker, transmuxer } = this;
if (worker) {
worker.postMessage({
const { transmuxer } = this;
if (this.workerContext) {
this.workerContext.worker.postMessage({
cmd: 'configure',
config,
});

View file

@ -2,11 +2,15 @@ import Transmuxer, { isPromise } from '../demux/transmuxer';
import { Events } from '../events';
import { ILogFunction, enableLogs, logger } from '../utils/logger';
import { EventEmitter } from 'eventemitter3';
import { ErrorDetails, ErrorTypes } from '../errors';
import type { RemuxedTrack, RemuxerResult } from '../types/remuxer';
import type { TransmuxerResult, ChunkMetadata } from '../types/transmuxer';
import { ErrorDetails, ErrorTypes } from '../errors';
export default function TransmuxerWorker(self) {
if (typeof __IN_WORKER__ !== 'undefined' && __IN_WORKER__) {
startWorker(self);
}
function startWorker(self) {
const observer = new EventEmitter();
const forwardMessage = (ev, data) => {
self.postMessage({ event: ev, data: data });

View file

@ -15,6 +15,7 @@ import type { TransmuxerResult, ChunkMetadata } from '../types/transmuxer';
import type { HlsConfig } from '../config';
import type { DecryptData } from '../loader/level-key';
import type { PlaylistLevelType } from '../types/loader';
import type { RationalTimestamp } from '../utils/timescale-conversion';
let now;
// performance.now() not available on WebWorker, at least on Safari Desktop
@ -22,7 +23,7 @@ try {
now = self.performance.now.bind(self.performance);
} catch (err) {
logger.debug('Unable to use Performance API on this environment');
now = self.Date.now;
now = typeof self !== 'undefined' && self.Date.now;
}
type MuxConfig =
@ -147,7 +148,19 @@ export default class Transmuxer {
const resetMuxers = this.needsProbing(discontinuity, trackSwitch);
if (resetMuxers) {
this.configureTransmuxer(uintData);
const error = this.configureTransmuxer(uintData);
if (error) {
logger.warn(`[transmuxer] ${error.message}`);
this.observer.emit(Events.ERROR, Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
fatal: false,
error,
reason: error.message,
});
stats.executeEnd = now();
return emptyResult(chunkMeta);
}
}
if (discontinuity || trackSwitch || initSegmentChange || resetMuxers) {
@ -220,12 +233,6 @@ export default class Transmuxer {
const { demuxer, remuxer } = this;
if (!demuxer || !remuxer) {
// If probing failed, then Hls.js has been given content it's not able to handle
this.observer.emit(Events.ERROR, Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
fatal: true,
reason: 'no demux matching with content found',
});
stats.executeEnd = now();
return [emptyResult(chunkMeta)];
}
@ -273,7 +280,7 @@ export default class Transmuxer {
chunkMeta.transmuxing.executeEnd = now();
}
resetInitialTimestamp(defaultInitPts: number | undefined) {
resetInitialTimestamp(defaultInitPts: RationalTimestamp | null) {
const { demuxer, remuxer } = this;
if (!demuxer || !remuxer) {
return;
@ -406,7 +413,7 @@ export default class Transmuxer {
});
}
private configureTransmuxer(data: Uint8Array) {
private configureTransmuxer(data: Uint8Array): void | Error {
const { config, observer, typeSupported, vendor } = this;
// probe for content type
let mux;
@ -417,11 +424,7 @@ export default class Transmuxer {
}
}
if (!mux) {
// If probing previous configs fail, use mp4 passthrough
logger.warn(
'Failed to find demuxer by probing frag, treating as mp4 passthrough'
);
mux = { demux: MP4Demuxer, remux: PassThroughRemuxer };
return new Error('Failed to find demuxer by probing fragment data');
}
// so let's check that current remuxer and demuxer are still valid
const demuxer = this.demuxer;
@ -483,20 +486,20 @@ export class TransmuxConfig {
public videoCodec?: string;
public initSegmentData?: Uint8Array;
public duration: number;
public defaultInitPts?: number;
public defaultInitPts: RationalTimestamp | null;
constructor(
audioCodec: string | undefined,
videoCodec: string | undefined,
initSegmentData: Uint8Array | undefined,
duration: number,
defaultInitPts?: number
defaultInitPts?: RationalTimestamp
) {
this.audioCodec = audioCodec;
this.videoCodec = videoCodec;
this.initSegmentData = initSegmentData;
this.duration = duration;
this.defaultInitPts = defaultInitPts;
this.defaultInitPts = defaultInitPts || null;
}
}
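
Throughout this file, `defaultInitPts` changes from a plain `number` to a `RationalTimestamp | null`. A rough sketch of what such a type represents, assuming the usual pairing of a time value with its timescale — the field names below are an assumption for illustration, not taken from this diff:

```ts
// Assumed shape of RationalTimestamp (see utils/timescale-conversion):
// a timestamp expressed as an integer count of timescale units.
type RationalTimestamp = {
  baseTime: number; // time value in `timescale` units
  timescale: number; // units per second (e.g. 90000 for MPEG-TS PTS/DTS)
};

// Converting such a timestamp to seconds:
const toSeconds = (t: RationalTimestamp): number => t.baseTime / t.timescale;
```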

View file

@ -99,20 +99,44 @@ class TSDemuxer implements Demuxer {
}
static syncOffset(data: Uint8Array): number {
const scanwindow =
const length = data.length;
let scanwindow =
Math.min(PACKET_LENGTH * 5, data.length - PACKET_LENGTH) + 1;
let i = 0;
while (i < scanwindow) {
// a TS init segment should contain at least 2 TS packets: PAT and PMT, each starting with 0x47
let foundPat = false;
for (let j = 0; j < scanwindow; j += PACKET_LENGTH) {
let packetStart = -1;
let tsPackets = 0;
for (let j = i; j < length; j += PACKET_LENGTH) {
if (data[j] === 0x47) {
if (!foundPat && parsePID(data, j) === 0) {
foundPat = true;
tsPackets++;
if (packetStart === -1) {
packetStart = j;
// First sync word found at offset, increase scan length (#5251)
if (packetStart !== 0) {
scanwindow =
Math.min(
packetStart + PACKET_LENGTH * 99,
data.length - PACKET_LENGTH
) + 1;
}
}
if (foundPat && j + PACKET_LENGTH > scanwindow) {
return i;
if (!foundPat) {
foundPat = parsePID(data, j) === 0;
}
// Sync word found at 0 with 3 packets, or found at an offset with at least 2 packets up to scanwindow (#5501)
if (
foundPat &&
tsPackets > 1 &&
((packetStart === 0 && tsPackets > 2) ||
j + PACKET_LENGTH > scanwindow)
) {
return packetStart;
}
} else if (tsPackets) {
// Exit if a sync word was found but the data does not contain contiguous packets (#5501)
return -1;
} else {
break;
}
@ -124,10 +148,6 @@ class TSDemuxer implements Demuxer {
/**
* Creates a track model internal to demuxer used to drive remuxing input
*
* @param type 'audio' | 'video' | 'id3' | 'text'
* @param duration
* @return TSDemuxer's internal track model
*/
static createTrack(
type: 'audio' | 'video' | 'id3' | 'text',
@ -357,7 +377,9 @@ class TSDemuxer implements Demuxer {
}
if (unknownPID !== null && !pmtParsed) {
logger.log(`unknown PID '${unknownPID}' in TS found`);
logger.warn(
`MPEG-TS PMT found at ${start} after unknown PID '${unknownPID}'. Backtracking to sync byte @${syncOffset} to parse all TS packets.`
);
unknownPID = null;
// we set it to -188, the += 188 in the for loop will reset start to 0
start = syncOffset - 188;
@ -378,11 +400,15 @@ class TSDemuxer implements Demuxer {
}
if (tsPacketErrors > 0) {
const error = new Error(
`Found ${tsPacketErrors} TS packet/s that do not start with 0x47`
);
this.observer.emit(Events.ERROR, Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
fatal: false,
reason: `Found ${tsPacketErrors} TS packet/s that do not start with 0x47`,
error,
reason: error.message,
});
}
@ -626,15 +652,15 @@ class TSDemuxer implements Demuxer {
}
if (!track.sps) {
const expGolombDecoder = new ExpGolomb(unit.data);
const sps = unit.data;
const expGolombDecoder = new ExpGolomb(sps);
const config = expGolombDecoder.readSPS();
track.width = config.width;
track.height = config.height;
track.pixelRatio = config.pixelRatio;
// TODO: `track.sps` is defined as a `number[]`, but we're setting it to a `Uint8Array[]`.
track.sps = [unit.data] as any;
track.sps = [sps];
track.duration = this._duration;
const codecarray = unit.data.subarray(1, 4);
const codecarray = sps.subarray(1, 4);
let codecstring = 'avc1.';
for (let i = 0; i < 3; i++) {
let h = codecarray[i].toString(16);
@ -655,8 +681,7 @@ class TSDemuxer implements Demuxer {
}
if (!track.pps) {
// TODO: `track.pss` is defined as a `number[]`, but we're setting it to a `Uint8Array[]`.
track.pps = [unit.data] as any;
track.pps = [unit.data];
}
break;
@ -872,23 +897,24 @@ class TSDemuxer implements Demuxer {
}
// if ADTS header does not start straight from the beginning of the PES payload, raise an error
if (offset !== startOffset) {
let reason;
let fatal;
if (offset < len - 1) {
let reason: string;
const recoverable = offset < len - 1;
if (recoverable) {
reason = `AAC PES did not start with ADTS header,offset:${offset}`;
fatal = false;
} else {
reason = 'no ADTS header found in AAC PES';
fatal = true;
reason = 'No ADTS header found in AAC PES';
}
logger.warn(`parsing error:${reason}`);
const error = new Error(reason);
logger.warn(`parsing error: ${reason}`);
this.observer.emit(Events.ERROR, Events.ERROR, {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_PARSING_ERROR,
fatal,
fatal: false,
levelRetry: recoverable,
error,
reason,
});
if (fatal) {
if (!recoverable) {
return;
}
}

View file

@ -1,198 +0,0 @@
/*
* Fork of webworkify-webpack with support for Webpack 5
* https://github.com/wupeng-engineer/webworkify-webpack/blob/db0de7/index.js
*/
const webpackBootstrapFunc = function () {// webpackBootstrap
/******/ var __webpack_modules__ = ENTRY_MODULE
/************************************************************************/
/******/ // The module cache
/******/ var __webpack_module_cache__ = {};
/******/
/******/ // The require function
/******/ var __webpack_require__ = function __webpack_require__(moduleId) {
/******/ // Check if module is in cache
/******/ var cachedModule = __webpack_module_cache__[moduleId];
/******/ if (cachedModule !== undefined) {
/******/ return cachedModule.exports;
/******/ }
/******/ // Create a new module (and put it into the cache)
/******/ var module = __webpack_module_cache__[moduleId] = {
/******/ // no module.id needed
/******/ // no module.loaded needed
/******/ exports: {}
/******/ };
/******/
/******/ // Execute the module function
/******/ __webpack_modules__[moduleId].call(module.exports, module, module.exports, __webpack_require__);
/******/
/******/ // Return the exports of the module
/******/ return module.exports;
/******/ }
/******/
/******/ // expose the modules object (__webpack_modules__)
/******/ __webpack_require__.m = __webpack_modules__;
/******/
/************************************************************************/
/******/ /* webpack/runtime/compat get default export */
/******/ (() => {
/******/ // getDefaultExport function for compatibility with non-harmony modules
/******/ __webpack_require__.n = (module) => {
/******/ var getter = module && module.__esModule ?
/******/ () => (module['default']) :
/******/ () => (module);
/******/ __webpack_require__.d(getter, { a: getter });
/******/ return getter;
/******/ };
/******/ })();
/******/
/******/ /* webpack/runtime/define property getters */
/******/ (() => {
/******/ // define getter functions for harmony exports
/******/ __webpack_require__.d = (exports, definition) => {
/******/ for(var key in definition) {
/******/ if(__webpack_require__.o(definition, key) && !__webpack_require__.o(exports, key)) {
/******/ Object.defineProperty(exports, key, { enumerable: true, get: definition[key] });
/******/ }
/******/ }
/******/ };
/******/ })();
/******/
/******/ /* webpack/runtime/hasOwnProperty shorthand */
/******/ (() => {
/******/ __webpack_require__.o = (obj, prop) => (Object.prototype.hasOwnProperty.call(obj, prop))
/******/ })();
/******/
/******/ /* webpack/runtime/make namespace object */
/******/ (() => {
/******/ // define __esModule on exports
/******/ __webpack_require__.r = (exports) => {
/******/ if(typeof Symbol !== 'undefined' && Symbol.toStringTag) {
/******/ Object.defineProperty(exports, Symbol.toStringTag, { value: 'Module' });
/******/ }
/******/ Object.defineProperty(exports, '__esModule', { value: true });
/******/ };
/******/ })();
/******/
/************************************************************************/
/******/
/******/ // module factories are used so entry inlining is disabled
/******/ // startup
/******/ // Load entry module and return exports
/******/ var result = __webpack_require__(ENTRY_MODULE)
/******/ return result.default || result
}
var webpackBootstrapFuncArr = webpackBootstrapFunc.toString().split('ENTRY_MODULE');
var moduleNameReqExp = '[\\.|\\-|\\+|\\w|\/|@]+';
var dependencyRegExp = '\\(\\s*(\/\\*.*?\\*\/)?\\s*.*?(' + moduleNameReqExp + ').*?\\)';
function quoteRegExp(str) {
return (str + '').replace(/[.?*+^$[\]\\(){}|-]/g, '\\$&');
}
function isNumeric(n) {
return !isNaN(1 * n);
}
function getModuleDependencies(sources, module, queueName) {
var retval = {};
retval[queueName] = [];
var fnString = module.toString().replace(/^"[^"]+"/,'function');;
var wrapperSignature = fnString.match(/^function\s?\w*\(\w+,\s*\w+,\s*(\w+)\)/) || fnString.match(/^\(\w+,\s*\w+,\s*(\w+)\)\s?\=\s?\>/);
if (!wrapperSignature) return retval;
var webpackRequireName = wrapperSignature[1];
var re = new RegExp('(\\\\n|\\W)' + quoteRegExp(webpackRequireName) + dependencyRegExp, 'g');
var match;
while ((match = re.exec(fnString))) {
if (match[3] === 'dll-reference') continue;
retval[queueName].push(match[3]);
}
re = new RegExp('\\(' + quoteRegExp(webpackRequireName) + '\\("(dll-reference\\s(' + moduleNameReqExp + '))"\\)\\)' + dependencyRegExp, 'g');
while ((match = re.exec(fnString))) {
if (!sources[match[2]]) {
retval[queueName].push(match[1]);
sources[match[2]] = __webpack_require__(match[1]).m;
}
retval[match[2]] = retval[match[2]] || [];
retval[match[2]].push(match[4]);
}
var keys = Object.keys(retval);
for (var i = 0; i < keys.length; i++) {
for (var j = 0; j < retval[keys[i]].length; j++) {
if (isNumeric(retval[keys[i]][j])) {
retval[keys[i]][j] = 1 * retval[keys[i]][j];
}
}
}
return retval;
}
function hasValuesInQueues(queues) {
var keys = Object.keys(queues);
return keys.reduce((hasValues, key) => hasValues || queues[key].length > 0, false);
}
function getRequiredModules(sources, moduleId) {
var modulesQueue = {
main: [moduleId]
};
var requiredModules = {
main: []
};
var seenModules = {
main: {}
};
while (hasValuesInQueues(modulesQueue)) {
var queues = Object.keys(modulesQueue);
for (var i = 0; i < queues.length; i++) {
var queueName = queues[i];
var queue = modulesQueue[queueName];
var moduleToCheck = queue.pop();
seenModules[queueName] = seenModules[queueName] || {};
if (seenModules[queueName][moduleToCheck] || !sources[queueName][moduleToCheck]) continue;
seenModules[queueName][moduleToCheck] = true;
requiredModules[queueName] = requiredModules[queueName] || [];
requiredModules[queueName].push(moduleToCheck);
var newModules = getModuleDependencies(sources, sources[queueName][moduleToCheck], queueName);
var newModulesKeys = Object.keys(newModules);
for (var j = 0; j < newModulesKeys.length; j++) {
modulesQueue[newModulesKeys[j]] = modulesQueue[newModulesKeys[j]] || [];
modulesQueue[newModulesKeys[j]] = modulesQueue[newModulesKeys[j]].concat(newModules[newModulesKeys[j]]);
}
}
}
return requiredModules;
}
function getWebpackString(requiredModules, sources, entryModule, key) {
const moduleString = requiredModules[key].map((id) => `"${id}": ${sources[key][id].toString().replace(/^"[^"]+"/,'function')}`).join(",");
return `${webpackBootstrapFuncArr[0]}{${moduleString}}${webpackBootstrapFuncArr[1]}"${entryModule}"${webpackBootstrapFuncArr[2]}`;
}
export default function (moduleId, options) {
options = options || {};
var sources = {
main: __webpack_modules__
};
var requiredModules = options.all ? { main: Object.keys(sources.main) } : getRequiredModules(sources, moduleId);
var src = '';
Object.keys(requiredModules).filter((m) => m !== 'main').forEach((module) => {
var entryModule = 0;
while (requiredModules[module][entryModule]) {
entryModule++;
}
requiredModules[module].push(entryModule);
sources[module][entryModule] = '(function(module, exports, __webpack_require__) { module.exports = __webpack_require__; })';
src = src + `var ${module} = (${getWebpackString(requiredModules, sources, entryModule, modules)})();\n`;
});
src = src + `new ((${getWebpackString(requiredModules, sources, moduleId, 'main')})())(self);`;
var blob = new window.Blob([src], {
type: 'text/javascript'
});
var URL = window.URL || window.webkitURL || window.mozURL || window.msURL;
var workerUrl = URL.createObjectURL(blob);
var worker = new window.Worker(workerUrl);
worker.objectURL = workerUrl;
return worker;
}

2
node_modules/hls.js/src/empty.js generated vendored
View file

@ -1,3 +1,3 @@
// This file is inserted as a shim for modules which we do not want to include into the distro.
// This replacement is done in the "resolve" section of the webpack config.
// This replacement is done in the "alias" plugin of the rollup config.
module.exports = undefined;

8
node_modules/hls.js/src/errors.ts generated vendored
View file

@ -11,10 +11,6 @@ export enum ErrorTypes {
OTHER_ERROR = 'otherError',
}
/**
* @enum {ErrorDetails}
* @typedef {string} ErrorDetail
*/
export enum ErrorDetails {
KEY_SYSTEM_NO_KEYS = 'keySystemNoKeys',
KEY_SYSTEM_NO_ACCESS = 'keySystemNoAccess',
@ -40,6 +36,8 @@ export enum ErrorDetails {
LEVEL_LOAD_ERROR = 'levelLoadError',
// Identifier for a level load timeout - data: { url : faulty URL, response : { code: error code, text: error text }}
LEVEL_LOAD_TIMEOUT = 'levelLoadTimeOut',
// Identifier for a level parse error - data: { url : faulty URL, error: Error, reason: error message }
LEVEL_PARSING_ERROR = 'levelParsingError',
// Identifier for a level switch error - data: { level : faulty level Id, event : error description}
LEVEL_SWITCH_ERROR = 'levelSwitchError',
// Identifier for an audio track load error - data: { url : faulty URL, response : { code: error code, text: error text }}
@ -59,6 +57,8 @@ export enum ErrorDetails {
// Identifier for a fragment parsing error event - data: { id : demuxer Id, reason : parsing error description }
// will be renamed DEMUX_PARSING_ERROR and switched to MUX_ERROR in the next major release
FRAG_PARSING_ERROR = 'fragParsingError',
// Identifier for a fragment or part load skipped because of a GAP tag or attribute
FRAG_GAP = 'fragGap',
// Identifier for a remux alloc error event - data: { id : demuxer Id, frag : fragment object, bytes : nb of bytes on which allocation failed , reason : error text }
REMUX_ALLOC_ERROR = 'remuxAllocError',
// Identifier for decrypt key load error - data: { frag : fragment object, response : { code: error code, text: error text }}
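
Two identifiers are added above: `LEVEL_PARSING_ERROR` and `FRAG_GAP`. A minimal sketch, using the public event API exposed on the `Hls` class elsewhere in this diff, of how a player could observe them (the handler body is illustrative only):

```ts
import Hls from 'hls.js';

const hls = new Hls();
hls.on(Hls.Events.ERROR, (_event, data) => {
  if (data.details === Hls.ErrorDetails.FRAG_GAP) {
    // A fragment or part load was skipped because of a GAP tag or attribute.
    console.warn('Skipped GAP segment', data.frag?.sn);
  } else if (data.details === Hls.ErrorDetails.LEVEL_PARSING_ERROR) {
    console.warn('Media playlist failed to parse:', data.reason);
  }
});
```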

7
node_modules/hls.js/src/events.ts generated vendored
View file

@ -49,10 +49,6 @@ import {
BufferFlushedData,
} from './types/events';
/**
* @readonly
* @enum {string}
*/
export enum Events {
// Fired before MediaSource is attaching to media element
MEDIA_ATTACHING = 'hlsMediaAttaching',
@ -168,6 +164,9 @@ export enum Events {
BACK_BUFFER_REACHED = 'hlsBackBufferReached',
}
/**
* Defines each Event type and payload by Event name. Used in {@link hls.js#HlsEventEmitter} to strongly type the event listener API.
*/
export interface HlsListeners {
[Events.MEDIA_ATTACHING]: (
event: Events.MEDIA_ATTACHING,

226
node_modules/hls.js/src/hls.ts generated vendored
View file

@ -1,4 +1,4 @@
import * as URLToolkit from 'url-toolkit';
import { buildAbsoluteURL } from 'url-toolkit';
import PlaylistLoader from './loader/playlist-loader';
import ID3TrackController from './controller/id3-track-controller';
import LatencyController from './controller/latency-controller';
@ -12,6 +12,7 @@ import { enableStreamingMode, hlsDefaultConfig, mergeConfig } from './config';
import { EventEmitter } from 'eventemitter3';
import { Events } from './events';
import { ErrorTypes, ErrorDetails } from './errors';
import { HdcpLevels } from './types/level';
import type { HlsEventEmitter, HlsListeners } from './events';
import type AudioTrackController from './controller/audio-track-controller';
import type AbrController from './controller/abr-controller';
@ -23,24 +24,34 @@ import type SubtitleTrackController from './controller/subtitle-track-controller
import type { ComponentAPI, NetworkComponentAPI } from './types/component-api';
import type { MediaPlaylist } from './types/media-playlist';
import type { HlsConfig } from './config';
import { HdcpLevel, HdcpLevels, Level } from './types/level';
import type { Fragment } from './loader/fragment';
import { BufferInfo } from './utils/buffer-helper';
import type { HdcpLevel, Level } from './types/level';
import type { BufferInfo } from './utils/buffer-helper';
import type AudioStreamController from './controller/audio-stream-controller';
import type BasePlaylistController from './controller/base-playlist-controller';
import type BaseStreamController from './controller/base-stream-controller';
import type ContentSteeringController from './controller/content-steering-controller';
import type ErrorController from './controller/error-controller';
import type FPSController from './controller/fps-controller';
/**
* @module Hls
* @class
* @constructor
* The `Hls` class is the core of the HLS.js library used to instantiate player instances.
* @public
*/
export default class Hls implements HlsEventEmitter {
private static defaultConfig?: HlsConfig;
private static defaultConfig: HlsConfig | undefined;
/**
* The runtime configuration used by the player. At instantiation this is `hls.userConfig` merged over `Hls.DefaultConfig`.
*/
public readonly config: HlsConfig;
/**
* The configuration object provided on player instantiation.
*/
public readonly userConfig: Partial<HlsConfig>;
private coreComponents: ComponentAPI[];
private networkControllers: NetworkComponentAPI[];
private _emitter: HlsEventEmitter = new EventEmitter();
private _autoLevelCapping: number;
private _maxHdcpLevel: HdcpLevel = null;
@ -54,30 +65,38 @@ export default class Hls implements HlsEventEmitter {
private subtitleTrackController: SubtitleTrackController;
private emeController: EMEController;
private cmcdController: CMCDController;
private _media: HTMLMediaElement | null = null;
private url: string | null = null;
/**
* Get the video-dev/hls.js package version.
*/
static get version(): string {
return __VERSION__;
}
/**
* Check if the required MediaSource Extensions are available.
*/
static isSupported(): boolean {
return isSupported();
}
static get Events() {
static get Events(): typeof Events {
return Events;
}
static get ErrorTypes() {
static get ErrorTypes(): typeof ErrorTypes {
return ErrorTypes;
}
static get ErrorDetails() {
static get ErrorDetails(): typeof ErrorDetails {
return ErrorDetails;
}
/**
* Get the default configuration applied to new instances.
*/
static get DefaultConfig(): HlsConfig {
if (!Hls.defaultConfig) {
return hlsDefaultConfig;
@ -87,7 +106,7 @@ export default class Hls implements HlsEventEmitter {
}
/**
* @type {HlsConfig}
* Replace the default configuration applied to new instances.
*/
static set DefaultConfig(defaultConfig: HlsConfig) {
Hls.defaultConfig = defaultConfig;
@ -95,14 +114,12 @@ export default class Hls implements HlsEventEmitter {
/**
* Creates an instance of an HLS client that can attach to exactly one `HTMLMediaElement`.
*
* @constructs Hls
* @param {HlsConfig} config
* @param userConfig - Configuration options applied over `Hls.DefaultConfig`
*/
constructor(userConfig: Partial<HlsConfig> = {}) {
enableLogs(userConfig.debug || false, 'Hls instance');
const config = (this.config = mergeConfig(Hls.DefaultConfig, userConfig));
this.userConfig = userConfig;
enableLogs(config.debug, 'Hls instance');
this._autoLevelCapping = -1;
@ -115,8 +132,10 @@ export default class Hls implements HlsEventEmitter {
abrController: ConfigAbrController,
bufferController: ConfigBufferController,
capLevelController: ConfigCapLevelController,
errorController: ConfigErrorController,
fpsController: ConfigFpsController,
} = config;
const errorController = new ConfigErrorController(this);
const abrController = (this.abrController = new ConfigAbrController(this));
const bufferController = (this.bufferController =
new ConfigBufferController(this));
@ -127,8 +146,15 @@ export default class Hls implements HlsEventEmitter {
const playListLoader = new PlaylistLoader(this);
const id3TrackController = new ID3TrackController(this);
// network controllers
const levelController = (this.levelController = new LevelController(this));
const ConfigContentSteeringController = config.contentSteeringController;
// ContentSteeringController is defined before LevelController to receive Multivariant Playlist events first
const contentSteering = ConfigContentSteeringController
? new ConfigContentSteeringController(this)
: null;
const levelController = (this.levelController = new LevelController(
this,
contentSteering
));
// FragmentTracker must be defined before StreamController because the order of event handling is important
const fragmentTracker = new FragmentTracker(this);
const keyLoader = new KeyLoader(this.config);
@ -148,6 +174,9 @@ export default class Hls implements HlsEventEmitter {
levelController,
streamController,
];
if (contentSteering) {
networkControllers.splice(1, 0, contentSteering);
}
this.networkControllers = networkControllers;
const coreComponents: ComponentAPI[] = [
@ -195,6 +224,14 @@ export default class Hls implements HlsEventEmitter {
);
this.coreComponents = coreComponents;
// Error controller handles errors before and after all other controllers
// This listener will be invoked after all other controllers error listeners
networkControllers.push(errorController);
const onErrorOut = errorController.onErrorOut;
if (typeof onErrorOut === 'function') {
this.on(Events.ERROR, onErrorOut, errorController);
}
}
createController(ControllerClass, components) {
@ -300,11 +337,15 @@ export default class Hls implements HlsEventEmitter {
this.coreComponents.forEach((component) => component.destroy());
this.coreComponents.length = 0;
// Remove any references that could be held in config options or callbacks
const config = this.config;
config.xhrSetup = config.fetchSetup = undefined;
// @ts-ignore
this.userConfig = null;
}
/**
* Attaches Hls.js to a media element
* @param {HTMLMediaElement} media
*/
attachMedia(media: HTMLMediaElement) {
logger.log('attachMedia');
@ -323,13 +364,12 @@ export default class Hls implements HlsEventEmitter {
/**
* Set the source URL. Can be relative or absolute.
* @param {string} url
*/
loadSource(url: string) {
this.stopLoad();
const media = this.media;
const loadedSource = this.url;
const loadingSource = (this.url = URLToolkit.buildAbsoluteURL(
const loadingSource = (this.url = buildAbsoluteURL(
self.location.href,
url,
{
@ -340,8 +380,7 @@ export default class Hls implements HlsEventEmitter {
if (
media &&
loadedSource &&
loadedSource !== loadingSource &&
this.bufferController.hasSourceTypes()
(loadedSource !== loadingSource || this.bufferController.hasSourceTypes())
) {
this.detachMedia();
this.attachMedia(media);
@ -354,8 +393,8 @@ export default class Hls implements HlsEventEmitter {
* Start loading data from the stream source.
* Depending on default config, client starts loading automatically when a source is set.
*
* @param {number} startPosition Set the start position to stream from
* @default -1 None (from earliest point)
* @param startPosition - Set the start position to stream from.
* Defaults to -1 (None: starts from earliest point)
*/
startLoad(startPosition: number = -1) {
logger.log(`startLoad(${startPosition})`);
@ -402,26 +441,22 @@ export default class Hls implements HlsEventEmitter {
}
/**
* @type {Level[]}
* @returns an array of levels (variants) sorted by HDCP-LEVEL, BANDWIDTH, SCORE, and RESOLUTION (height)
*/
get levels(): Array<Level> {
get levels(): Level[] {
const levels = this.levelController.levels;
return levels ? levels : [];
}
/**
* Index of quality level currently played
* @type {number}
* Index of quality level (variant) currently played
*/
get currentLevel(): number {
return this.streamController.currentLevel;
}
/**
* Set quality level index immediately .
* This will flush the current buffer to replace the quality asap.
* That means playback will interrupt at least shortly to re-buffer and re-sync eventually.
* @type {number} -1 for automatic level selection
* Set quality level index immediately. This will flush the current buffer to replace the quality asap. That means playback will interrupt at least shortly to re-buffer and re-sync eventually. Set to -1 for automatic level selection.
*/
set currentLevel(newLevel: number) {
logger.log(`set currentLevel:${newLevel}`);
@ -432,7 +467,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Index of next quality level loaded as scheduled by stream controller.
* @type {number}
*/
get nextLevel(): number {
return this.streamController.nextLevel;
@ -442,7 +476,7 @@ export default class Hls implements HlsEventEmitter {
* Set quality level index for next loaded data.
* This will switch the video quality asap, without interrupting playback.
* May abort current loading of data, and flush parts of buffer (outside currently played fragment region).
* @type {number} -1 for automatic level selection
* @param newLevel - Pass -1 for automatic level selection
*/
set nextLevel(newLevel: number) {
logger.log(`set nextLevel:${newLevel}`);
@ -452,7 +486,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Return the quality level of the currently loaded segment, or of the last loaded segment if none is loading currently
* @type {number}
*/
get loadLevel(): number {
return this.levelController.level;
@ -462,7 +495,7 @@ export default class Hls implements HlsEventEmitter {
* Set quality level index for next loaded data in a conservative way.
* This will switch the quality without flushing, but interrupt current loading.
* Thus the moment when the quality switch will appear in effect will only be after the already existing buffer.
* @type {number} newLevel -1 for automatic level selection
* @param newLevel - Pass -1 for automatic level selection
*/
set loadLevel(newLevel: number) {
logger.log(`set loadLevel:${newLevel}`);
@ -471,7 +504,6 @@ export default class Hls implements HlsEventEmitter {
/**
* get next quality level loaded
* @type {number}
*/
get nextLoadLevel(): number {
return this.levelController.nextLoadLevel;
@ -480,7 +512,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Set quality level of next loaded segment in a fully "non-destructive" way.
* Same as `loadLevel` but will wait for next switch (until current loading is done).
* @type {number} level
*/
set nextLoadLevel(level: number) {
this.levelController.nextLoadLevel = level;
@ -489,7 +520,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Return "first level": like a default level, if not set,
* falls back to index of first level referenced in manifest
* @type {number}
*/
get firstLevel(): number {
return Math.max(this.levelController.firstLevel, this.minAutoLevel);
@ -497,7 +527,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Sets "first-level", see getter.
* @type {number}
*/
set firstLevel(newLevel: number) {
logger.log(`set firstLevel:${newLevel}`);
@ -509,7 +538,6 @@ export default class Hls implements HlsEventEmitter {
* if not overridden by the user, the first level appearing in the manifest will be used as start level
* if -1 : automatic start level selection, playback will start from level matching download bandwidth
* (determined from download of first segment)
* @type {number}
*/
get startLevel(): number {
return this.levelController.startLevel;
@ -520,7 +548,6 @@ export default class Hls implements HlsEventEmitter {
* if not overridden by the user, the first level appearing in the manifest will be used as start level
* if -1 : automatic start level selection, playback will start from level matching download bandwidth
* (determined from download of first segment)
* @type {number} newLevel
*/
set startLevel(newLevel: number) {
logger.log(`set startLevel:${newLevel}`);
@ -533,18 +560,15 @@ export default class Hls implements HlsEventEmitter {
}
/**
* Get the current setting for capLevelToPlayerSize
*
* @type {boolean}
* Whether level capping is enabled.
* Default value is set via `config.capLevelToPlayerSize`.
*/
get capLevelToPlayerSize(): boolean {
return this.config.capLevelToPlayerSize;
}
/**
* set dynamically set capLevelToPlayerSize against (`CapLevelController`)
*
* @type {boolean}
* Enables or disables level capping. If disabled after previously enabled, `nextLevelSwitch` will be immediately called.
*/
set capLevelToPlayerSize(shouldStartCapping: boolean) {
const newCapLevelToPlayerSize = !!shouldStartCapping;
@ -564,15 +588,13 @@ export default class Hls implements HlsEventEmitter {
/**
* Capping/max level value that should be used by automatic level selection algorithm (`ABRController`)
* @type {number}
*/
get autoLevelCapping(): number {
return this._autoLevelCapping;
}
/**
* get bandwidth estimate
* @type {number}
* Returns the current bandwidth estimate in bits per second, when available. Otherwise, `NaN` is returned.
*/
get bandwidthEstimate(): number {
const { bwEstimator } = this.abrController;
@ -583,9 +605,20 @@ export default class Hls implements HlsEventEmitter {
}
/**
* Capping/max level value that should be used by automatic level selection algorithm (`ABRController`)
* get time to first byte estimate
* @type {number}
*/
get ttfbEstimate(): number {
const { bwEstimator } = this.abrController;
if (!bwEstimator) {
return NaN;
}
return bwEstimator.getEstimateTTFB();
}
/**
* Capping/max level value that should be used by automatic level selection algorithm (`ABRController`)
*/
set autoLevelCapping(newLevel: number) {
if (this._autoLevelCapping !== newLevel) {
logger.log(`set autoLevelCapping:${newLevel}`);
@ -605,7 +638,6 @@ export default class Hls implements HlsEventEmitter {
/**
* True when automatic level selection enabled
* @type {boolean}
*/
get autoLevelEnabled(): boolean {
return this.levelController.manualLevel === -1;
@ -613,7 +645,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Level set manually (if any)
* @type {number}
*/
get manualLevel(): number {
return this.levelController.manualLevel;
@ -621,7 +652,6 @@ export default class Hls implements HlsEventEmitter {
/**
* min level selectable in auto mode according to config.minAutoBitrate
* @type {number}
*/
get minAutoLevel(): number {
const {
@ -642,7 +672,6 @@ export default class Hls implements HlsEventEmitter {
/**
* max level selectable in auto mode according to autoLevelCapping
* @type {number}
*/
get maxAutoLevel(): number {
const { levels, autoLevelCapping, maxHdcpLevel } = this;
@ -668,7 +697,6 @@ export default class Hls implements HlsEventEmitter {
/**
* next automatically selected quality level
* @type {number}
*/
get nextAutoLevel(): number {
// ensure next auto level is between min and max auto level
@ -684,7 +712,6 @@ export default class Hls implements HlsEventEmitter {
* in case of load error on level N, hls.js can set nextAutoLevel to N-1 for example)
* forced value is valid for one fragment. upon successful frag loading at forced level,
* this value will be reset to -1 by ABR controller.
* @type {number}
*/
set nextAutoLevel(nextLevel: number) {
this.abrController.nextAutoLevel = Math.max(this.minAutoLevel, nextLevel);
@ -692,7 +719,6 @@ export default class Hls implements HlsEventEmitter {
/**
* get the datetime value relative to media.currentTime for the active level Program Date Time if present
* @type {Date}
*/
public get playingDate(): Date | null {
return this.streamController.currentProgramDateTime;
@ -703,7 +729,7 @@ export default class Hls implements HlsEventEmitter {
}
/**
* @type {AudioTrack[]}
* Get the list of selectable audio tracks
*/
get audioTracks(): Array<MediaPlaylist> {
const audioTrackController = this.audioTrackController;
@ -712,7 +738,6 @@ export default class Hls implements HlsEventEmitter {
/**
* index of the selected audio track (index in audio track lists)
* @type {number}
*/
get audioTrack(): number {
const audioTrackController = this.audioTrackController;
@ -721,7 +746,6 @@ export default class Hls implements HlsEventEmitter {
/**
* selects an audio track, based on its index in audio track lists
* @type {number}
*/
set audioTrack(audioTrackId: number) {
const audioTrackController = this.audioTrackController;
@ -732,7 +756,6 @@ export default class Hls implements HlsEventEmitter {
/**
* get alternate subtitle tracks list from playlist
* @type {MediaPlaylist[]}
*/
get subtitleTracks(): Array<MediaPlaylist> {
const subtitleTrackController = this.subtitleTrackController;
@ -743,7 +766,6 @@ export default class Hls implements HlsEventEmitter {
/**
* index of the selected subtitle track (index in subtitle track lists)
* @type {number}
*/
get subtitleTrack(): number {
const subtitleTrackController = this.subtitleTrackController;
@ -756,7 +778,6 @@ export default class Hls implements HlsEventEmitter {
/**
* select a subtitle track, based on its index in subtitle track lists
* @type {number}
*/
set subtitleTrack(subtitleTrackId: number) {
const subtitleTrackController = this.subtitleTrackController;
@ -766,7 +787,7 @@ export default class Hls implements HlsEventEmitter {
}
/**
* @type {boolean}
* Whether subtitle display is enabled or not
*/
get subtitleDisplay(): boolean {
const subtitleTrackController = this.subtitleTrackController;
@ -777,7 +798,6 @@ export default class Hls implements HlsEventEmitter {
/**
* Enable/disable subtitle display rendering
* @type {boolean}
*/
set subtitleDisplay(value: boolean) {
const subtitleTrackController = this.subtitleTrackController;
@ -788,42 +808,38 @@ export default class Hls implements HlsEventEmitter {
/**
* get mode for Low-Latency HLS loading
* @type {boolean}
*/
get lowLatencyMode() {
get lowLatencyMode(): boolean {
return this.config.lowLatencyMode;
}
/**
* Enable/disable Low-Latency HLS part playlist and segment loading, and start live streams at playlist PART-HOLD-BACK rather than HOLD-BACK.
* @type {boolean}
*/
set lowLatencyMode(mode: boolean) {
this.config.lowLatencyMode = mode;
}
/**
* position (in seconds) of live sync point (ie edge of live position minus safety delay defined by ```hls.config.liveSyncDuration```)
* @type {number}
* Position (in seconds) of live sync point (ie edge of live position minus safety delay defined by ```hls.config.liveSyncDuration```)
* @returns null prior to loading live Playlist
*/
get liveSyncPosition(): number | null {
return this.latencyController.liveSyncPosition;
}
/**
* estimated position (in seconds) of live edge (ie edge of live playlist plus time sync playlist advanced)
* returns 0 before first playlist is loaded
* @type {number}
* Estimated position (in seconds) of live edge (ie edge of live playlist plus time sync playlist advanced)
* @returns 0 before first playlist is loaded
*/
get latency() {
get latency(): number {
return this.latencyController.latency;
}
/**
* maximum distance from the edge before the player seeks forward to ```hls.liveSyncPosition```
* configured using ```liveMaxLatencyDurationCount``` (multiple of target duration) or ```liveMaxLatencyDuration```
* returns 0 before first playlist is loaded
* @type {number}
* @returns 0 before first playlist is loaded
*/
get maxLatency(): number {
return this.latencyController.maxLatency;
@ -831,7 +847,6 @@ export default class Hls implements HlsEventEmitter {
/**
* target distance from the edge as calculated by the latency controller
* @type {number}
*/
get targetLatency(): number | null {
return this.latencyController.targetLatency;
@ -839,7 +854,6 @@ export default class Hls implements HlsEventEmitter {
/**
* the rate at which the edge of the current live playlist is advancing or 1 if there is none
* @type {number}
*/
get drift(): number | null {
return this.latencyController.drift;
@ -847,7 +861,6 @@ export default class Hls implements HlsEventEmitter {
/**
* set to true when startLoad is called before MANIFEST_PARSED event
* @type {boolean}
*/
get forceStartLoad(): boolean {
return this.streamController.forceStartLoad;
@ -863,29 +876,57 @@ export type {
HlsListeners,
HlsEventEmitter,
HlsConfig,
Fragment,
BufferInfo,
HdcpLevels,
HdcpLevel,
AbrController,
AudioStreamController,
AudioTrackController,
BasePlaylistController,
BaseStreamController,
BufferController,
CapLevelController,
CMCDController,
ContentSteeringController,
EMEController,
ErrorController,
FPSController,
SubtitleTrackController,
};
export type {
ComponentAPI,
AbrComponentAPI,
NetworkComponentAPI,
} from './types/component-api';
export type {
ABRControllerConfig,
BufferControllerConfig,
CapLevelControllerConfig,
CMCDControllerConfig,
EMEControllerConfig,
DRMSystemsConfiguration,
DRMSystemOptions,
FPSControllerConfig,
FragmentLoaderConfig,
FragmentLoaderConstructor,
HlsLoadPolicies,
LevelControllerConfig,
LoaderConfig,
LoadPolicy,
MP4RemuxerConfig,
PlaylistLoaderConfig,
PlaylistLoaderConstructor,
RetryConfig,
StreamControllerConfig,
LatencyControllerConfig,
MetadataControllerConfig,
TimelineControllerConfig,
TSDemuxerConfig,
} from './config';
export type { MediaKeySessionContext } from './controller/eme-controller';
export type { ILogger } from './utils/logger';
export type { SubtitleStreamController } from './controller/subtitle-stream-controller';
export type { TimelineController } from './controller/timeline-controller';
export type { CuesInterface } from './utils/cues';
export type {
MediaKeyFunc,
@ -903,17 +944,17 @@ export type {
UserdataSample,
} from './types/demuxer';
export type {
HdcpLevel,
HdcpLevels,
HlsSkip,
HlsUrlParameters,
LevelAttributes,
LevelParsed,
VariableMap,
} from './types/level';
export type {
PlaylistLevelType,
HlsChunkPerformanceTiming,
HlsPerformanceTiming,
HlsProgressivePerformanceTiming,
PlaylistContextType,
PlaylistLoaderContext,
FragmentLoaderContext,
@ -928,9 +969,9 @@ export type {
LoaderOnError,
LoaderOnSuccess,
LoaderOnTimeout,
HlsProgressivePerformanceTiming,
} from './types/loader';
export type {
MediaAttributes,
MediaPlaylistType,
MainPlaylistType,
AudioPlaylistType,
@ -940,6 +981,7 @@ export type { Track, TrackSet } from './types/track';
export type { ChunkMetadata } from './types/transmuxer';
export type {
BaseSegment,
Fragment,
Part,
ElementaryStreams,
ElementaryStreamTypes,
@ -985,6 +1027,7 @@ export type {
LevelSwitchingData,
LevelUpdatedData,
LiveBackBufferData,
ContentSteeringOptions,
ManifestLoadedData,
ManifestLoadingData,
ManifestParsedData,
@ -997,4 +1040,9 @@ export type {
SubtitleTracksUpdatedData,
SubtitleTrackSwitchData,
} from './types/events';
export type {
NetworkErrorAction,
ErrorActionFlags,
IErrorAction,
} from './controller/error-controller';
export type { AttrList } from './utils/attr-list';
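
The getters and methods documented above (`isSupported`, `attachMedia`, `loadSource`, `lowLatencyMode`, `bandwidthEstimate` and the new `ttfbEstimate`) combine as in the following sketch, which assumes a page containing a `<video id="video">` element and a playlist URL of your own:

```ts
import Hls from 'hls.js';

const video = document.getElementById('video') as HTMLVideoElement;

if (Hls.isSupported()) {
  const hls = new Hls({ lowLatencyMode: true });
  hls.attachMedia(video);
  hls.loadSource('https://example.com/playlist.m3u8');
  hls.on(Hls.Events.MANIFEST_PARSED, () => {
    // bandwidthEstimate is bits per second (NaN until an estimate exists);
    // ttfbEstimate is the new time-to-first-byte estimate getter.
    console.log(hls.bandwidthEstimate, hls.ttfbEstimate);
  });
}
```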

View file

@ -5,6 +5,9 @@ function getSourceBuffer(): typeof self.SourceBuffer {
return self.SourceBuffer || (self as any).WebKitSourceBuffer;
}
/**
* @ignore
*/
export function isSupported(): boolean {
const mediaSource = getMediaSource();
if (!mediaSource) {
@ -17,7 +20,7 @@ export function isSupported(): boolean {
mediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E,mp4a.40.2"');
// if SourceBuffer is exposed ensure its API is valid
// safari and old version of Chrome doe not expose SourceBuffer globally so checking SourceBuffer.prototype is impossible
// Older browsers do not expose SourceBuffer globally so checking SourceBuffer.prototype is impossible
const sourceBufferValidAPI =
!sourceBuffer ||
(sourceBuffer.prototype &&
@ -26,6 +29,9 @@ export function isSupported(): boolean {
return !!isTypeSupported && !!sourceBufferValidAPI;
}
/**
* @ignore
*/
export function changeTypeSupported(): boolean {
const sourceBuffer = getSourceBuffer();
return (

View file

@ -1,7 +1,8 @@
import { AttrList } from '../utils/attr-list';
import { logger } from '../utils/logger';
export enum DateRangeAttribute {
// Avoid exporting const enum so that these values can be inlined
const enum DateRangeAttribute {
ID = 'ID',
CLASS = 'CLASS',
START_DATE = 'START-DATE',
@ -13,6 +14,24 @@ export enum DateRangeAttribute {
SCTE35_IN = 'SCTE35-IN',
}
export function isDateRangeCueAttribute(attrName: string): boolean {
return (
attrName !== DateRangeAttribute.ID &&
attrName !== DateRangeAttribute.CLASS &&
attrName !== DateRangeAttribute.START_DATE &&
attrName !== DateRangeAttribute.DURATION &&
attrName !== DateRangeAttribute.END_DATE &&
attrName !== DateRangeAttribute.END_ON_NEXT
);
}
export function isSCTE35Attribute(attrName: string): boolean {
return (
attrName === DateRangeAttribute.SCTE35_OUT ||
attrName === DateRangeAttribute.SCTE35_IN
);
}
export class DateRange {
public attr: AttrList;
private _startDate: Date;

View file

@ -5,9 +5,14 @@ import {
LoaderConfiguration,
FragmentLoaderContext,
} from '../types/loader';
import { getLoaderConfigWithoutReties } from '../utils/error-helper';
import type { HlsConfig } from '../config';
import type { BaseSegment, Part } from './fragment';
import type { FragLoadedData, PartsLoadedData } from '../types/events';
import type {
ErrorData,
FragLoadedData,
PartsLoadedData,
} from '../types/events';
const MIN_CHUNK_SIZE = Math.pow(2, 17); // 128kb
@ -41,16 +46,16 @@ export default class FragmentLoader {
const url = frag.url;
if (!url) {
return Promise.reject(
new LoadError(
{
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.FRAG_LOAD_ERROR,
fatal: false,
frag,
networkDetails: null,
},
`Fragment does not have a ${url ? 'part list' : 'url'}`
)
new LoadError({
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.FRAG_LOAD_ERROR,
fatal: false,
frag,
error: new Error(
`Fragment does not have a ${url ? 'part list' : 'url'}`
),
networkDetails: null,
})
);
}
this.abort();
@ -63,6 +68,10 @@ export default class FragmentLoader {
if (this.loader) {
this.loader.destroy();
}
if (frag.gap) {
reject(createGapLoadError(frag));
return;
}
const loader =
(this.loader =
frag.loader =
@ -70,11 +79,15 @@ export default class FragmentLoader {
? new FragmentILoader(config)
: (new DefaultILoader(config) as Loader<FragmentLoaderContext>));
const loaderContext = createLoaderContext(frag);
const loadPolicy = getLoaderConfigWithoutReties(
config.fragLoadPolicy.default
);
const loaderConfig: LoaderConfiguration = {
timeout: config.fragLoadingTimeOut,
loadPolicy,
timeout: loadPolicy.maxLoadTimeMs,
maxRetry: 0,
retryDelay: 0,
maxRetryDelay: config.fragLoadingMaxRetryTimeout,
maxRetryDelay: 0,
highWaterMark: frag.sn === 'initSegment' ? Infinity : MIN_CHUNK_SIZE,
};
// Assign frag stats to the loader's stats reference
@ -94,7 +107,7 @@ export default class FragmentLoader {
networkDetails,
});
},
onError: (response, context, networkDetails) => {
onError: (response, context, networkDetails, stats) => {
this.resetLoader(frag, loader);
reject(
new LoadError({
@ -102,8 +115,10 @@ export default class FragmentLoader {
details: ErrorDetails.FRAG_LOAD_ERROR,
fatal: false,
frag,
response,
response: { url, data: undefined, ...response },
error: new Error(`HTTP Error ${response.code} ${response.text}`),
networkDetails,
stats,
})
);
},
@ -115,11 +130,13 @@ export default class FragmentLoader {
details: ErrorDetails.INTERNAL_ABORTED,
fatal: false,
frag,
error: new Error('Aborted'),
networkDetails,
stats,
})
);
},
onTimeout: (response, context, networkDetails) => {
onTimeout: (stats, context, networkDetails) => {
this.resetLoader(frag, loader);
reject(
new LoadError({
@ -127,7 +144,9 @@ export default class FragmentLoader {
details: ErrorDetails.FRAG_LOAD_TIMEOUT,
fatal: false,
frag,
error: new Error(`Timeout after ${loaderConfig.timeout}ms`),
networkDetails,
stats,
})
);
},
@ -160,6 +179,10 @@ export default class FragmentLoader {
if (this.loader) {
this.loader.destroy();
}
if (frag.gap || part.gap) {
reject(createGapLoadError(frag, part));
return;
}
const loader =
(this.loader =
frag.loader =
@ -167,11 +190,16 @@ export default class FragmentLoader {
? new FragmentILoader(config)
: (new DefaultILoader(config) as Loader<FragmentLoaderContext>));
const loaderContext = createLoaderContext(frag, part);
// Should we define another load policy for parts?
const loadPolicy = getLoaderConfigWithoutReties(
config.fragLoadPolicy.default
);
const loaderConfig: LoaderConfiguration = {
timeout: config.fragLoadingTimeOut,
loadPolicy,
timeout: loadPolicy.maxLoadTimeMs,
maxRetry: 0,
retryDelay: 0,
maxRetryDelay: config.fragLoadingMaxRetryTimeout,
maxRetryDelay: 0,
highWaterMark: MIN_CHUNK_SIZE,
};
// Assign part stats to the loader's stats reference
@ -189,7 +217,7 @@ export default class FragmentLoader {
onProgress(partLoadedData);
resolve(partLoadedData);
},
onError: (response, context, networkDetails) => {
onError: (response, context, networkDetails, stats) => {
this.resetLoader(frag, loader);
reject(
new LoadError({
@ -198,8 +226,14 @@ export default class FragmentLoader {
fatal: false,
frag,
part,
response,
response: {
url: loaderContext.url,
data: undefined,
...response,
},
error: new Error(`HTTP Error ${response.code} ${response.text}`),
networkDetails,
stats,
})
);
},
@ -213,11 +247,13 @@ export default class FragmentLoader {
fatal: false,
frag,
part,
error: new Error('Aborted'),
networkDetails,
stats,
})
);
},
onTimeout: (response, context, networkDetails) => {
onTimeout: (stats, context, networkDetails) => {
this.resetLoader(frag, loader);
reject(
new LoadError({
@ -226,7 +262,9 @@ export default class FragmentLoader {
fatal: false,
frag,
part,
error: new Error(`Timeout after ${loaderConfig.timeout}ms`),
networkDetails,
stats,
})
);
},
@ -312,25 +350,41 @@ function createLoaderContext(
return loaderContext;
}
function createGapLoadError(frag: Fragment, part?: Part): LoadError {
const error = new Error(`GAP ${frag.gap ? 'tag' : 'attribute'} found`);
const errorData: FragLoadFailResult = {
type: ErrorTypes.MEDIA_ERROR,
details: ErrorDetails.FRAG_GAP,
fatal: false,
frag,
error,
networkDetails: null,
};
if (part) {
errorData.part = part;
}
(part ? part : frag).stats.aborted = true;
return new LoadError(errorData);
}
export class LoadError extends Error {
public readonly data: FragLoadFailResult;
constructor(data: FragLoadFailResult, ...params) {
super(...params);
constructor(data: FragLoadFailResult) {
super(data.error.message);
this.data = data;
}
}
export interface FragLoadFailResult {
type: string;
details: string;
fatal: boolean;
export interface FragLoadFailResult extends ErrorData {
frag: Fragment;
part?: Part;
response?: {
data: any;
// error status code
code: number;
// error description
text: string;
url: string;
};
networkDetails: any;
}

View file

@ -10,7 +10,7 @@ import type {
} from '../types/loader';
import type { KeySystemFormats } from '../utils/mediakeys-helper';
export enum ElementaryStreamTypes {
export const enum ElementaryStreamTypes {
AUDIO = 'audio',
VIDEO = 'video',
AUDIOVIDEO = 'audiovideo',
@ -91,6 +91,9 @@ export class BaseSegment {
}
}
/**
* Object representing parsed data from an HLS Segment. Found in {@link hls.js#LevelDetails.fragments}.
*/
export class Fragment extends BaseSegment {
private _decryptdata: LevelKey | null = null;
@ -120,8 +123,6 @@ export class Fragment extends BaseSegment {
public startPTS?: number;
// The ending Presentation Time Stamp (PTS) of the fragment. Set after transmux complete.
public endPTS?: number;
// The latest Presentation Time Stamp (PTS) appended to the buffer.
public appendedPTS?: number;
// The starting Decode Time Stamp (DTS) of the fragment. Set after transmux complete.
public startDTS!: number;
// The ending Decode Time Stamp (DTS) of the fragment. Set after transmux complete.
@ -146,6 +147,8 @@ export class Fragment extends BaseSegment {
public initSegment: Fragment | null = null;
// Fragment is the last fragment in the media playlist
public endList?: boolean;
// Fragment is marked by an EXT-X-GAP tag indicating that it does not contain media data and should not be loaded
public gap?: boolean;
constructor(type: PlaylistLevelType, baseurl: string) {
super(baseurl);
@ -261,6 +264,9 @@ export class Fragment extends BaseSegment {
}
}
/**
* Object representing parsed data from an HLS Partial Segment. Found in {@link hls.js#LevelDetails.partList}.
*/
export class Part extends BaseSegment {
public readonly fragOffset: number = 0;
public readonly duration: number = 0;

View file

@ -6,9 +6,10 @@ import {
LoaderCallbacks,
Loader,
KeyLoaderContext,
PlaylistLevelType,
} from '../types/loader';
import { LoadError } from './fragment-loader';
import type { HlsConfig } from '../hls';
import type { HlsConfig } from '../config';
import type { Fragment } from '../loader/fragment';
import type { ComponentAPI } from '../types/component-api';
import type { KeyLoadedData } from '../types/events';
@ -32,10 +33,13 @@ export default class KeyLoader implements ComponentAPI {
this.config = config;
}
abort() {
abort(type?: PlaylistLevelType) {
for (const uri in this.keyUriToKeyInfo) {
const loader = this.keyUriToKeyInfo[uri].loader;
if (loader) {
if (type && type !== loader.context.frag.type) {
return;
}
loader.abort();
}
}
@ -68,14 +72,17 @@ export default class KeyLoader implements ComponentAPI {
createKeyLoadError(
frag: Fragment,
details: ErrorDetails = ErrorDetails.KEY_LOAD_ERROR,
error: Error,
networkDetails?: any,
message?: string
response?: { url: string; data: undefined; code: number; text: string }
): LoadError {
return new LoadError({
type: ErrorTypes.NETWORK_ERROR,
details,
fatal: false,
frag,
response,
error,
networkDetails,
});
}
@ -89,7 +96,10 @@ export default class KeyLoader implements ComponentAPI {
const { sn, cc } = loadingFrag;
for (let i = 0; i < encryptedFragments.length; i++) {
const frag = encryptedFragments[i];
if (cc <= frag.cc && (sn === 'initSegment' || sn < frag.sn)) {
if (
cc <= frag.cc &&
(sn === 'initSegment' || frag.sn === 'initSegment' || sn < frag.sn)
) {
this.emeController
.selectKeySystemFormat(frag)
.then((keySystemFormat) => {
@ -123,16 +133,13 @@ export default class KeyLoader implements ComponentAPI {
}
const decryptdata = frag.decryptdata;
if (!decryptdata) {
const errorMessage = keySystemFormat
? `Expected frag.decryptdata to be defined after setting format ${keySystemFormat}`
: 'Missing decryption data on fragment in onKeyLoading';
const error = new Error(
keySystemFormat
? `Expected frag.decryptdata to be defined after setting format ${keySystemFormat}`
: 'Missing decryption data on fragment in onKeyLoading'
);
return Promise.reject(
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_ERROR,
null,
errorMessage
)
this.createKeyLoadError(frag, ErrorDetails.KEY_LOAD_ERROR, error)
);
}
const uri = decryptdata.uri;
@ -141,8 +148,7 @@ export default class KeyLoader implements ComponentAPI {
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_ERROR,
null,
`Invalid key URI: "${uri}"`
new Error(`Invalid key URI: "${uri}"`)
)
);
}
@ -159,7 +165,11 @@ export default class KeyLoader implements ComponentAPI {
case 'status-pending':
case 'usable':
case 'usable-in-future':
return keyInfo.keyLoadPromise;
return keyInfo.keyLoadPromise.then((keyLoadedData) => {
// Return the correct fragment with updated decryptdata key and loaded keyInfo
decryptdata.key = keyLoadedData.keyInfo.decryptdata.key;
return { frag, keyInfo };
});
}
// If we have a key session and status and it is not pending or usable, continue
// This will go back to the eme-controller for expired keys to get a new keyLoadPromise
@ -190,8 +200,9 @@ export default class KeyLoader implements ComponentAPI {
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_ERROR,
null,
`Key supplied with unsupported METHOD: "${decryptdata.method}"`
new Error(
`Key supplied with unsupported METHOD: "${decryptdata.method}"`
)
)
);
}
@ -235,12 +246,13 @@ export default class KeyLoader implements ComponentAPI {
// maxRetry is 0 so that instead of retrying the same key on the same variant multiple times,
// key-loader will trigger an error and rely on stream-controller to handle retry logic.
// this will also align retry logic with fragment-loader
const loadPolicy = config.keyLoadPolicy.default;
const loaderConfig: LoaderConfiguration = {
timeout: config.fragLoadingTimeOut,
loadPolicy,
timeout: loadPolicy.maxLoadTimeMs,
maxRetry: 0,
retryDelay: config.fragLoadingRetryDelay,
maxRetryDelay: config.fragLoadingMaxRetryTimeout,
highWaterMark: 0,
retryDelay: 0,
maxRetryDelay: 0,
};
const loaderCallbacks: LoaderCallbacks<KeyLoaderContext> = {
@ -256,8 +268,8 @@ export default class KeyLoader implements ComponentAPI {
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_ERROR,
networkDetails,
'after key load, decryptdata unset or changed'
new Error('after key load, decryptdata unset or changed'),
networkDetails
)
);
}
@ -273,16 +285,21 @@ export default class KeyLoader implements ComponentAPI {
},
onError: (
error: { code: number; text: string },
response: { code: number; text: string },
context: KeyLoaderContext,
networkDetails: any
networkDetails: any,
stats: LoaderStats
) => {
this.resetLoader(context);
reject(
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_ERROR,
networkDetails
new Error(
`HTTP Error ${response.code} loading key ${response.text}`
),
networkDetails,
{ url: loaderContext.url, data: undefined, ...response }
)
);
},
@ -297,6 +314,7 @@ export default class KeyLoader implements ComponentAPI {
this.createKeyLoadError(
frag,
ErrorDetails.KEY_LOAD_TIMEOUT,
new Error('key loading timed out'),
networkDetails
)
);
@ -312,6 +330,7 @@ export default class KeyLoader implements ComponentAPI {
this.createKeyLoadError(
frag,
ErrorDetails.INTERNAL_ABORTED,
new Error('key loading aborted'),
networkDetails
)
);

View file

@ -2,9 +2,13 @@ import { Part } from './fragment';
import type { Fragment } from './fragment';
import type { AttrList } from '../utils/attr-list';
import type { DateRange } from './date-range';
import type { VariableMap } from '../types/level';
const DEFAULT_TARGET_DURATION = 10;
/**
* Object representing parsed data from an HLS Media Playlist. Found in {@link hls.js#Level.details}.
*/
export class LevelDetails {
public PTSKnown: boolean = false;
public alignedSliding: boolean = false;
@ -48,6 +52,9 @@ export class LevelDetails {
public driftStart: number = 0;
public driftEnd: number = 0;
public encryptedFragments: Fragment[];
public playlistParsingError: Error | null = null;
public variableList: VariableMap | null = null;
public hasVariableRefs = false;
constructor(baseUrl) {
this.fragments = [];

View file

@ -60,22 +60,24 @@ export class LevelKey implements DecryptData {
if (this.method === 'AES-128' || this.method === 'NONE') {
return true;
}
switch (this.keyFormat) {
case 'identity':
// Maintain support for clear SAMPLE-AES with MPEG-3 TS
return this.method === 'SAMPLE-AES';
case KeySystemFormats.FAIRPLAY:
case KeySystemFormats.WIDEVINE:
case KeySystemFormats.PLAYREADY:
case KeySystemFormats.CLEARKEY:
return (
[
'ISO-23001-7',
'SAMPLE-AES',
'SAMPLE-AES-CENC',
'SAMPLE-AES-CTR',
].indexOf(this.method) !== -1
);
if (this.keyFormat === 'identity') {
// Maintain support for clear SAMPLE-AES with MPEG-2 TS
return this.method === 'SAMPLE-AES';
} else if (__USE_EME_DRM__) {
switch (this.keyFormat) {
case KeySystemFormats.FAIRPLAY:
case KeySystemFormats.WIDEVINE:
case KeySystemFormats.PLAYREADY:
case KeySystemFormats.CLEARKEY:
return (
[
'ISO-23001-7',
'SAMPLE-AES',
'SAMPLE-AES-CENC',
'SAMPLE-AES-CTR',
].indexOf(this.method) !== -1
);
}
}
}
return false;
@ -110,6 +112,10 @@ export class LevelKey implements DecryptData {
return decryptdata;
}
if (!__USE_EME_DRM__) {
return this;
}
// Initialize keyId if possible
const keyBytes = convertDataUriToArrayBytes(this.uri);
if (keyBytes) {

View file

@ -5,29 +5,50 @@ import { LevelDetails } from './level-details';
import { LevelKey } from './level-key';
import { AttrList } from '../utils/attr-list';
import { logger } from '../utils/logger';
import type { CodecType } from '../utils/codecs';
import {
addVariableDefinition,
hasVariableReferences,
importVariableDefinition,
substituteVariables,
substituteVariablesInAttributes,
} from '../utils/variable-substitution';
import { isCodecType } from '../utils/codecs';
import type { CodecType } from '../utils/codecs';
import type {
MediaPlaylist,
AudioGroup,
MediaPlaylistType,
MediaAttributes,
} from '../types/media-playlist';
import type { PlaylistLevelType } from '../types/loader';
import type { LevelAttributes, LevelParsed } from '../types/level';
import type { LevelAttributes, LevelParsed, VariableMap } from '../types/level';
import type { ContentSteeringOptions } from '../types/events';
type M3U8ParserFragments = Array<Fragment | null>;
type ParsedMultiVariantPlaylist = {
export type ParsedMultivariantPlaylist = {
contentSteering: ContentSteeringOptions | null;
levels: LevelParsed[];
playlistParsingError: Error | null;
sessionData: Record<string, AttrList> | null;
sessionKeys: LevelKey[] | null;
startTimeOffset: number | null;
variableList: VariableMap | null;
hasVariableRefs: boolean;
};
type ParsedMultivariantMediaOptions = {
AUDIO?: MediaPlaylist[];
SUBTITLES?: MediaPlaylist[];
'CLOSED-CAPTIONS'?: MediaPlaylist[];
};
// https://regex101.com is your friend
const MASTER_PLAYLIST_REGEX =
/#EXT-X-STREAM-INF:([^\r\n]*)(?:[\r\n](?:#[^\r\n]*)?)*([^\r\n]+)|#EXT-X-SESSION-DATA:([^\r\n]*)[\r\n]+|#EXT-X-SESSION-KEY:([^\n\r]*)[\r\n]+/g;
/#EXT-X-STREAM-INF:([^\r\n]*)(?:[\r\n](?:#[^\r\n]*)?)*([^\r\n]+)|#EXT-X-(SESSION-DATA|SESSION-KEY|DEFINE|CONTENT-STEERING|START):([^\r\n]*)[\r\n]+/g;
const MASTER_PLAYLIST_MEDIA_REGEX = /#EXT-X-MEDIA:(.*)/g;
const IS_MEDIA_PLAYLIST = /^#EXT(?:INF|-X-TARGETDURATION):/m; // Handle empty Media Playlist (first EXTINF not signaled, but TARGETDURATION present)
const LEVEL_PLAYLIST_REGEX_FAST = new RegExp(
[
/#EXTINF:\s*(\d*(?:\.\d+)?)(?:,(.*)\s+)?/.source, // duration (#EXTINF:<duration>,<title>), group 1 => duration, group 2 => title
@ -42,7 +63,7 @@ const LEVEL_PLAYLIST_REGEX_FAST = new RegExp(
const LEVEL_PLAYLIST_REGEX_SLOW = new RegExp(
[
/#(EXTM3U)/.source,
/#EXT-X-(DATERANGE|KEY|MAP|PART|PART-INF|PLAYLIST-TYPE|PRELOAD-HINT|RENDITION-REPORT|SERVER-CONTROL|SKIP|START):(.+)/
/#EXT-X-(DATERANGE|DEFINE|KEY|MAP|PART|PART-INF|PLAYLIST-TYPE|PRELOAD-HINT|RENDITION-REPORT|SERVER-CONTROL|SKIP|START):(.+)/
.source,
/#EXT-X-(BITRATE|DISCONTINUITY-SEQUENCE|MEDIA-SEQUENCE|TARGETDURATION|VERSION): *(\d+)/
.source,
@ -81,30 +102,60 @@ export default class M3U8Parser {
return buildAbsoluteURL(baseUrl, url, { alwaysNormalize: true });
}
static isMediaPlaylist(str: string): boolean {
return IS_MEDIA_PLAYLIST.test(str);
}
static parseMasterPlaylist(
string: string,
baseurl: string
): ParsedMultiVariantPlaylist {
const levels: LevelParsed[] = [];
): ParsedMultivariantPlaylist {
const hasVariableRefs = __USE_VARIABLE_SUBSTITUTION__
? hasVariableReferences(string)
: false;
const parsed: ParsedMultivariantPlaylist = {
contentSteering: null,
levels: [],
playlistParsingError: null,
sessionData: null,
sessionKeys: null,
startTimeOffset: null,
variableList: null,
hasVariableRefs,
};
const levelsWithKnownCodecs: LevelParsed[] = [];
const sessionData: Record<string, AttrList> = {};
const sessionKeys: LevelKey[] = [];
let hasSessionData = false;
MASTER_PLAYLIST_REGEX.lastIndex = 0;
let result: RegExpExecArray | null;
while ((result = MASTER_PLAYLIST_REGEX.exec(string)) != null) {
if (result[1]) {
// '#EXT-X-STREAM-INF' is found, parse level tag in group 1
const attrs = new AttrList(result[1]);
const attrs = new AttrList(result[1]) as LevelAttributes;
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(parsed, attrs, [
'CODECS',
'SUPPLEMENTAL-CODECS',
'ALLOWED-CPC',
'PATHWAY-ID',
'STABLE-VARIANT-ID',
'AUDIO',
'VIDEO',
'SUBTITLES',
'CLOSED-CAPTIONS',
'NAME',
]);
}
const uri = __USE_VARIABLE_SUBSTITUTION__
? substituteVariables(parsed, result[2])
: result[2];
const level: LevelParsed = {
attrs,
bitrate:
attrs.decimalInteger('AVERAGE-BANDWIDTH') ||
attrs.decimalInteger('BANDWIDTH'),
name: attrs.NAME,
url: M3U8Parser.resolve(result[2], baseurl),
url: M3U8Parser.resolve(uri, baseurl),
};
const resolution = attrs.decimalResolution('RESOLUTION');
@ -114,7 +165,7 @@ export default class M3U8Parser {
}
setCodecs(
(attrs.CODECS || '').split(/[ ,]+/).filter((c) => c),
((attrs.CODECS as string) || '').split(/[ ,]+/).filter((c) => c),
level
);
@ -126,57 +177,151 @@ export default class M3U8Parser {
levelsWithKnownCodecs.push(level);
}
levels.push(level);
parsed.levels.push(level);
} else if (result[3]) {
// '#EXT-X-SESSION-DATA' is found, parse session data in group 3
const sessionAttrs = new AttrList(result[3]);
if (sessionAttrs['DATA-ID']) {
hasSessionData = true;
sessionData[sessionAttrs['DATA-ID']] = sessionAttrs;
}
} else if (result[4]) {
// '#EXT-X-SESSION-KEY' is found
const keyTag = result[4];
const sessionKey = parseKey(keyTag, baseurl);
if (sessionKey.encrypted && sessionKey.isSupported()) {
sessionKeys.push(sessionKey);
} else {
logger.warn(
`[Keys] Ignoring invalid EXT-X-SESSION-KEY tag: "${keyTag}"`
);
const tag = result[3];
const attributes = result[4];
switch (tag) {
case 'SESSION-DATA': {
// #EXT-X-SESSION-DATA
const sessionAttrs = new AttrList(attributes);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(parsed, sessionAttrs, [
'DATA-ID',
'LANGUAGE',
'VALUE',
'URI',
]);
}
const dataId = sessionAttrs['DATA-ID'];
if (dataId) {
if (parsed.sessionData === null) {
parsed.sessionData = {};
}
parsed.sessionData[dataId] = sessionAttrs;
}
break;
}
case 'SESSION-KEY': {
// #EXT-X-SESSION-KEY
const sessionKey = parseKey(attributes, baseurl, parsed);
if (sessionKey.encrypted && sessionKey.isSupported()) {
if (parsed.sessionKeys === null) {
parsed.sessionKeys = [];
}
parsed.sessionKeys.push(sessionKey);
} else {
logger.warn(
`[Keys] Ignoring invalid EXT-X-SESSION-KEY tag: "${attributes}"`
);
}
break;
}
case 'DEFINE': {
// #EXT-X-DEFINE
if (__USE_VARIABLE_SUBSTITUTION__) {
const variableAttributes = new AttrList(attributes);
substituteVariablesInAttributes(parsed, variableAttributes, [
'NAME',
'VALUE',
'QUERYPARAM',
]);
addVariableDefinition(parsed, variableAttributes, baseurl);
}
break;
}
case 'CONTENT-STEERING': {
// #EXT-X-CONTENT-STEERING
const contentSteeringAttributes = new AttrList(attributes);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(
parsed,
contentSteeringAttributes,
['SERVER-URI', 'PATHWAY-ID']
);
}
parsed.contentSteering = {
uri: M3U8Parser.resolve(
contentSteeringAttributes['SERVER-URI'],
baseurl
),
pathwayId: contentSteeringAttributes['PATHWAY-ID'] || '.',
};
break;
}
case 'START': {
// #EXT-X-START
parsed.startTimeOffset = parseStartTimeOffset(attributes);
break;
}
default:
break;
}
}
}
// Filter out levels with unknown codecs if it does not remove all levels
const stripUnknownCodecLevels =
levelsWithKnownCodecs.length > 0 &&
levelsWithKnownCodecs.length < levels.length;
levelsWithKnownCodecs.length < parsed.levels.length;
return {
levels: stripUnknownCodecLevels ? levelsWithKnownCodecs : levels,
sessionData: hasSessionData ? sessionData : null,
sessionKeys: sessionKeys.length ? sessionKeys : null,
};
parsed.levels = stripUnknownCodecLevels
? levelsWithKnownCodecs
: parsed.levels;
if (parsed.levels.length === 0) {
parsed.playlistParsingError = new Error('no levels found in manifest');
}
return parsed;
}
static parseMasterPlaylistMedia(
string: string,
baseurl: string,
type: MediaPlaylistType,
groups: Array<AudioGroup> = []
): Array<MediaPlaylist> {
parsed: ParsedMultivariantPlaylist
): ParsedMultivariantMediaOptions {
let result: RegExpExecArray | null;
const medias: Array<MediaPlaylist> = [];
const results: ParsedMultivariantMediaOptions = {};
const levels = parsed.levels;
const groupsByType = {
AUDIO: levels.map((level: LevelParsed) => ({
id: level.attrs.AUDIO,
audioCodec: level.audioCodec,
})),
SUBTITLES: levels.map((level: LevelParsed) => ({
id: level.attrs.SUBTITLES,
textCodec: level.textCodec,
})),
'CLOSED-CAPTIONS': [],
};
let id = 0;
MASTER_PLAYLIST_MEDIA_REGEX.lastIndex = 0;
while ((result = MASTER_PLAYLIST_MEDIA_REGEX.exec(string)) !== null) {
const attrs = new AttrList(result[1]) as LevelAttributes;
if (attrs.TYPE === type) {
const attrs = new AttrList(result[1]) as MediaAttributes;
const type: MediaPlaylistType | undefined = attrs.TYPE as
| MediaPlaylistType
| undefined;
if (type) {
const groups = groupsByType[type];
const medias: MediaPlaylist[] = results[type] || [];
results[type] = medias;
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(parsed, attrs, [
'URI',
'GROUP-ID',
'LANGUAGE',
'ASSOC-LANGUAGE',
'STABLE-RENDITION-ID',
'NAME',
'INSTREAM-ID',
'CHARACTERISTICS',
'CHANNELS',
]);
}
const media: MediaPlaylist = {
attrs,
bitrate: 0,
id: id++,
groupId: attrs['GROUP-ID'],
groupId: attrs['GROUP-ID'] || '',
instreamId: attrs['INSTREAM-ID'],
name: attrs.NAME || attrs.LANGUAGE || '',
type,
@ -187,7 +332,7 @@ export default class M3U8Parser {
url: attrs.URI ? M3U8Parser.resolve(attrs.URI, baseurl) : '',
};
if (groups.length) {
if (groups?.length) {
// If there are audio or text groups signalled in the manifest, let's look for a matching codec string for this track
// If we don't find the track signalled, let's use the first audio group's codec we have
// Acting as a best guess
@ -200,7 +345,7 @@ export default class M3U8Parser {
medias.push(media);
}
}
return medias;
return results;
}
static parseLevelPlaylist(
@ -208,7 +353,8 @@ export default class M3U8Parser {
baseurl: string,
id: number,
type: PlaylistLevelType,
levelUrlId: number
levelUrlId: number,
multivariantVariableList: VariableMap | null
): LevelDetails {
const level = new LevelDetails(baseurl);
const fragments: M3U8ParserFragments = level.fragments;
@ -228,6 +374,9 @@ export default class M3U8Parser {
LEVEL_PLAYLIST_REGEX_FAST.lastIndex = 0;
level.m3u8 = string;
level.hasVariableRefs = __USE_VARIABLE_SUBSTITUTION__
? hasVariableReferences(string)
: false;
while ((result = LEVEL_PLAYLIST_REGEX_FAST.exec(string)) !== null) {
if (createNextFrag) {
@ -266,7 +415,10 @@ export default class M3U8Parser {
frag.urlId = levelUrlId;
fragments.push(frag);
// avoid sliced strings https://github.com/video-dev/hls.js/issues/939
frag.relurl = (' ' + result[3]).slice(1);
const uri = (' ' + result[3]).slice(1);
frag.relurl = __USE_VARIABLE_SUBSTITUTION__
? substituteVariables(level, uri)
: uri;
assignProgramDateTime(frag, prevFrag);
prevFrag = frag;
totalduration += frag.duration;
@ -316,6 +468,11 @@ export default class M3U8Parser {
break;
case 'SKIP': {
const skipAttrs = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, skipAttrs, [
'RECENTLY-REMOVED-DATERANGES',
]);
}
const skippedSegments =
skipAttrs.decimalInteger('SKIPPED-SEGMENTS');
if (Number.isFinite(skippedSegments)) {
@ -336,7 +493,7 @@ export default class M3U8Parser {
break;
}
case 'TARGETDURATION':
level.targetduration = parseFloat(value1);
level.targetduration = Math.max(parseInt(value1), 1);
break;
case 'VERSION':
level.version = parseInt(value1);
@ -356,6 +513,7 @@ export default class M3U8Parser {
frag.tagList.push(['DIS']);
break;
case 'GAP':
frag.gap = true;
frag.tagList.push([tag]);
break;
case 'BITRATE':
@ -363,6 +521,22 @@ export default class M3U8Parser {
break;
case 'DATERANGE': {
const dateRangeAttr = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, dateRangeAttr, [
'ID',
'CLASS',
'START-DATE',
'END-DATE',
'SCTE35-CMD',
'SCTE35-OUT',
'SCTE35-IN',
]);
substituteVariablesInAttributes(
level,
dateRangeAttr,
dateRangeAttr.clientAttrs
);
}
const dateRange = new DateRange(
dateRangeAttr,
level.dateRanges[dateRangeAttr.ID]
@ -376,11 +550,33 @@ export default class M3U8Parser {
frag.tagList.push(['EXT-X-DATERANGE', value1]);
break;
}
case 'DEFINE': {
if (__USE_VARIABLE_SUBSTITUTION__) {
const variableAttributes = new AttrList(value1);
substituteVariablesInAttributes(level, variableAttributes, [
'NAME',
'VALUE',
'IMPORT',
'QUERYPARAM',
]);
if ('IMPORT' in variableAttributes) {
importVariableDefinition(
level,
variableAttributes,
multivariantVariableList
);
} else {
addVariableDefinition(level, variableAttributes, baseurl);
}
}
break;
}
case 'DISCONTINUITY-SEQUENCE':
discontinuityCounter = parseInt(value1);
break;
case 'KEY': {
const levelKey = parseKey(value1, baseurl);
const levelKey = parseKey(value1, baseurl, level);
if (levelKey.isSupported()) {
if (levelKey.method === 'NONE') {
levelkeys = undefined;
@ -398,18 +594,17 @@ export default class M3U8Parser {
}
break;
}
case 'START': {
const startAttrs = new AttrList(value1);
const startTimeOffset =
startAttrs.decimalFloatingPoint('TIME-OFFSET');
// TIME-OFFSET can be 0
if (Number.isFinite(startTimeOffset)) {
level.startTimeOffset = startTimeOffset;
}
case 'START':
level.startTimeOffset = parseStartTimeOffset(value1);
break;
}
case 'MAP': {
const mapAttrs = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, mapAttrs, [
'BYTERANGE',
'URI',
]);
}
if (frag.duration) {
// Initial segment tag is after segment duration tag.
// #EXTINF: 6.0
@ -462,8 +657,15 @@ export default class M3U8Parser {
const previousFragmentPart =
currentPart > 0 ? partList[partList.length - 1] : undefined;
const index = currentPart++;
const partAttrs = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, partAttrs, [
'BYTERANGE',
'URI',
]);
}
const part = new Part(
new AttrList(value1),
partAttrs,
frag,
baseurl,
index,
@ -475,11 +677,19 @@ export default class M3U8Parser {
}
case 'PRELOAD-HINT': {
const preloadHintAttrs = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, preloadHintAttrs, ['URI']);
}
level.preloadHint = preloadHintAttrs;
break;
}
case 'RENDITION-REPORT': {
const renditionReportAttrs = new AttrList(value1);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(level, renditionReportAttrs, [
'URI',
]);
}
level.renditionReports = level.renditionReports || [];
level.renditionReports.push(renditionReportAttrs);
break;
@ -545,16 +755,28 @@ export default class M3U8Parser {
}
}
function parseKey(keyTag: string, baseurl: string): LevelKey {
function parseKey(
keyTagAttributes: string,
baseurl: string,
parsed: ParsedMultivariantPlaylist | LevelDetails
): LevelKey {
// https://tools.ietf.org/html/rfc8216#section-4.3.2.4
const keyAttrs = new AttrList(keyTag);
const decryptmethod = keyAttrs.enumeratedString('METHOD') ?? '';
const keyAttrs = new AttrList(keyTagAttributes);
if (__USE_VARIABLE_SUBSTITUTION__) {
substituteVariablesInAttributes(parsed, keyAttrs, [
'KEYFORMAT',
'KEYFORMATVERSIONS',
'URI',
'IV',
'URI',
]);
}
const decryptmethod = keyAttrs.METHOD ?? '';
const decrypturi = keyAttrs.URI;
const decryptiv = keyAttrs.hexadecimalInteger('IV');
const decryptkeyformatversions =
keyAttrs.enumeratedString('KEYFORMATVERSIONS');
const decryptkeyformatversions = keyAttrs.KEYFORMATVERSIONS;
// From RFC: This attribute is OPTIONAL; its absence indicates an implicit value of "identity".
const decryptkeyformat = keyAttrs.enumeratedString('KEYFORMAT') ?? 'identity';
const decryptkeyformat = keyAttrs.KEYFORMAT ?? 'identity';
if (decrypturi && keyAttrs.IV && !decryptiv) {
logger.error(`Invalid IV: ${keyAttrs.IV}`);
@ -578,6 +800,15 @@ function parseKey(keyTag: string, baseurl: string): LevelKey {
);
}
function parseStartTimeOffset(startAttributes: string): number | null {
const startAttrs = new AttrList(startAttributes);
const startTimeOffset = startAttrs.decimalFloatingPoint('TIME-OFFSET');
if (Number.isFinite(startTimeOffset)) {
return startTimeOffset;
}
return null;
}
function setCodecs(codecs: Array<string>, level: LevelParsed) {
['video', 'audio', 'text'].forEach((type: CodecType) => {
const filtered = codecs.filter((codec) => isCodecType(codec, type));
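The parser changes above thread an EXT-X-DEFINE variable list through both multivariant and media playlist parsing and substitute references in URIs and selected attributes. A simplified sketch of that substitution, assuming the `{$name}` reference syntax from the HLS specification; this is not the library's substituteVariables implementation:

```ts
type VariableMap = Record<string, string>;

// Replace {$name} references with values collected from EXT-X-DEFINE tags.
// Unknown references are left untouched in this sketch.
function substitute(variableList: VariableMap, input: string): string {
  return input.replace(/\{\$([a-zA-Z0-9_-]+)\}/g, (match, name) =>
    name in variableList ? variableList[name] : match
  );
}

// #EXT-X-DEFINE:NAME="auth",VALUE="token123"
const variables: VariableMap = { auth: 'token123' };
console.log(substitute(variables, 'segment1.ts?token={$auth}'));
// -> "segment1.ts?token=token123"
```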

View file

@ -4,18 +4,16 @@
* Once loaded, dispatches events with parsed data-models of manifest/levels/audio/subtitle tracks.
*
* Uses loader(s) set in config to do actual internal loading of resource tasks.
*
* @module
*
*/
import { Events } from '../events';
import { ErrorDetails, ErrorTypes } from '../errors';
import { logger } from '../utils/logger';
import M3U8Parser from './m3u8-parser';
import type { LevelParsed } from '../types/level';
import type { LevelParsed, VariableMap } from '../types/level';
import type {
Loader,
LoaderCallbacks,
LoaderConfiguration,
LoaderContext,
LoaderResponse,
@ -24,15 +22,17 @@ import type {
} from '../types/loader';
import { PlaylistContextType, PlaylistLevelType } from '../types/loader';
import { LevelDetails } from './level-details';
import type Hls from '../hls';
import { AttrList } from '../utils/attr-list';
import type Hls from '../hls';
import type {
ErrorData,
LevelLoadingData,
ManifestLoadingData,
TrackLoadingData,
} from '../types/events';
import { NetworkComponentAPI } from '../types/component-api';
import type { NetworkComponentAPI } from '../types/component-api';
import type { MediaAttributes } from '../types/media-playlist';
import type { LoaderConfig, RetryConfig } from '../config';
function mapContextToLevelType(
context: PlaylistLoaderContext
@ -68,6 +68,7 @@ class PlaylistLoader implements NetworkComponentAPI {
private readonly loaders: {
[key: string]: Loader<LoaderContext>;
} = Object.create(null);
private variableList: VariableMap | null = null;
constructor(hls: Hls) {
this.hls = hls;
@ -106,18 +107,15 @@ class PlaylistLoader implements NetworkComponentAPI {
const PLoader = config.pLoader;
const Loader = config.loader;
const InternalLoader = PLoader || Loader;
const loader = new InternalLoader(config) as Loader<PlaylistLoaderContext>;
context.loader = loader;
this.loaders[context.type] = loader;
return loader;
}
private getInternalLoader(
context: PlaylistLoaderContext
): Loader<LoaderContext> {
): Loader<LoaderContext> | undefined {
return this.loaders[context.type];
}
@ -142,6 +140,7 @@ class PlaylistLoader implements NetworkComponentAPI {
}
public destroy(): void {
this.variableList = null;
this.unregisterListeners();
this.destroyInternalLoaders();
}
@ -151,9 +150,9 @@ class PlaylistLoader implements NetworkComponentAPI {
data: ManifestLoadingData
) {
const { url } = data;
this.variableList = null;
this.load({
id: null,
groupId: null,
level: 0,
responseType: 'text',
type: PlaylistContextType.MANIFEST,
@ -166,7 +165,6 @@ class PlaylistLoader implements NetworkComponentAPI {
const { id, level, url, deliveryDirectives } = data;
this.load({
id,
groupId: null,
level,
responseType: 'text',
type: PlaylistContextType.LEVEL,
@ -227,35 +225,17 @@ class PlaylistLoader implements NetworkComponentAPI {
loader.abort();
}
let maxRetry;
let timeout;
let retryDelay;
let maxRetryDelay;
// apply different configs for retries depending on
// context (manifest, level, audio/subs playlist)
switch (context.type) {
case PlaylistContextType.MANIFEST:
maxRetry = config.manifestLoadingMaxRetry;
timeout = config.manifestLoadingTimeOut;
retryDelay = config.manifestLoadingRetryDelay;
maxRetryDelay = config.manifestLoadingMaxRetryTimeout;
break;
case PlaylistContextType.LEVEL:
case PlaylistContextType.AUDIO_TRACK:
case PlaylistContextType.SUBTITLE_TRACK:
// Manage retries in Level/Track Controller
maxRetry = 0;
timeout = config.levelLoadingTimeOut;
break;
default:
maxRetry = config.levelLoadingMaxRetry;
timeout = config.levelLoadingTimeOut;
retryDelay = config.levelLoadingRetryDelay;
maxRetryDelay = config.levelLoadingMaxRetryTimeout;
break;
let loadPolicy: LoaderConfig;
if (context.type === PlaylistContextType.MANIFEST) {
loadPolicy = config.manifestLoadPolicy.default;
} else {
loadPolicy = Object.assign({}, config.playlistLoadPolicy.default, {
timeoutRetry: null,
errorRetry: null,
});
}
loader = this.createInternalLoader(context);
// Override level/track timeout for LL-HLS requests
@ -282,26 +262,84 @@ class PlaylistLoader implements NetworkComponentAPI {
const partTarget = levelDetails.partTarget;
const targetDuration = levelDetails.targetduration;
if (partTarget && targetDuration) {
timeout = Math.min(
Math.max(partTarget * 3, targetDuration * 0.8) * 1000,
timeout
);
const maxLowLatencyPlaylistRefresh =
Math.max(partTarget * 3, targetDuration * 0.8) * 1000;
loadPolicy = Object.assign({}, loadPolicy, {
maxTimeToFirstByteMs: Math.min(
maxLowLatencyPlaylistRefresh,
loadPolicy.maxTimeToFirstByteMs
),
maxLoadTimeMs: Math.min(
maxLowLatencyPlaylistRefresh,
loadPolicy.maxTimeToFirstByteMs
),
});
}
}
}
const legacyRetryCompatibility: RetryConfig | Record<string, void> =
loadPolicy.errorRetry || loadPolicy.timeoutRetry || {};
const loaderConfig: LoaderConfiguration = {
timeout,
maxRetry,
retryDelay,
maxRetryDelay,
highWaterMark: 0,
loadPolicy,
timeout: loadPolicy.maxLoadTimeMs,
maxRetry: legacyRetryCompatibility.maxNumRetry || 0,
retryDelay: legacyRetryCompatibility.retryDelayMs || 0,
maxRetryDelay: legacyRetryCompatibility.maxRetryDelayMs || 0,
};
const loaderCallbacks = {
onSuccess: this.loadsuccess.bind(this),
onError: this.loaderror.bind(this),
onTimeout: this.loadtimeout.bind(this),
const loaderCallbacks: LoaderCallbacks<PlaylistLoaderContext> = {
onSuccess: (response, stats, context, networkDetails) => {
const loader = this.getInternalLoader(context) as
| Loader<PlaylistLoaderContext>
| undefined;
this.resetInternalLoader(context.type);
const string = response.data as string;
// Validate if it is an M3U8 at all
if (string.indexOf('#EXTM3U') !== 0) {
this.handleManifestParsingError(
response,
context,
new Error('no EXTM3U delimiter'),
networkDetails || null,
stats
);
return;
}
stats.parsing.start = performance.now();
if (M3U8Parser.isMediaPlaylist(string)) {
this.handleTrackOrLevelPlaylist(
response,
stats,
context,
networkDetails || null,
loader
);
} else {
this.handleMasterPlaylist(response, stats, context, networkDetails);
}
},
onError: (response, context, networkDetails, stats) => {
this.handleNetworkError(
context,
networkDetails,
false,
response,
stats
);
},
onTimeout: (stats, context, networkDetails) => {
this.handleNetworkError(
context,
networkDetails,
true,
undefined,
stats
);
},
};
// logger.debug(`[playlist-loader]: Calling internal loader delegate for URL: ${context.url}`);
@ -309,55 +347,6 @@ class PlaylistLoader implements NetworkComponentAPI {
loader.load(context, loaderConfig, loaderCallbacks);
}
private loadsuccess(
response: LoaderResponse,
stats: LoaderStats,
context: PlaylistLoaderContext,
networkDetails: any = null
): void {
this.resetInternalLoader(context.type);
const string = response.data as string;
// Validate if it is an M3U8 at all
if (string.indexOf('#EXTM3U') !== 0) {
this.handleManifestParsingError(
response,
context,
'no EXTM3U delimiter',
networkDetails
);
return;
}
stats.parsing.start = performance.now();
// Check if chunk-list or master. handle empty chunk list case (first EXTINF not signaled, but TARGETDURATION present)
if (
string.indexOf('#EXTINF:') > 0 ||
string.indexOf('#EXT-X-TARGETDURATION:') > 0
) {
this.handleTrackOrLevelPlaylist(response, stats, context, networkDetails);
} else {
this.handleMasterPlaylist(response, stats, context, networkDetails);
}
}
private loaderror(
response: LoaderResponse,
context: PlaylistLoaderContext,
networkDetails: any = null
): void {
this.handleNetworkError(context, networkDetails, false, response);
}
private loadtimeout(
stats: LoaderStats,
context: PlaylistLoaderContext,
networkDetails: any = null
): void {
this.handleNetworkError(context, networkDetails, true);
}
private handleMasterPlaylist(
response: LoaderResponse,
stats: LoaderStats,
@ -369,48 +358,35 @@ class PlaylistLoader implements NetworkComponentAPI {
const url = getResponseUrl(response, context);
const { levels, sessionData, sessionKeys } = M3U8Parser.parseMasterPlaylist(
string,
url
);
if (!levels.length) {
const parsedResult = M3U8Parser.parseMasterPlaylist(string, url);
if (parsedResult.playlistParsingError) {
this.handleManifestParsingError(
response,
context,
'no level found in manifest',
networkDetails
parsedResult.playlistParsingError,
networkDetails,
stats
);
return;
}
// multi level playlist, parse level info
const audioGroups = levels.map((level: LevelParsed) => ({
id: level.attrs.AUDIO,
audioCodec: level.audioCodec,
}));
const {
contentSteering,
levels,
sessionData,
sessionKeys,
startTimeOffset,
variableList,
} = parsedResult;
const subtitleGroups = levels.map((level: LevelParsed) => ({
id: level.attrs.SUBTITLES,
textCodec: level.textCodec,
}));
this.variableList = variableList;
const audioTracks = M3U8Parser.parseMasterPlaylistMedia(
string,
url,
'AUDIO',
audioGroups
);
const subtitles = M3U8Parser.parseMasterPlaylistMedia(
string,
url,
'SUBTITLES',
subtitleGroups
);
const captions = M3U8Parser.parseMasterPlaylistMedia(
string,
url,
'CLOSED-CAPTIONS'
);
const {
AUDIO: audioTracks = [],
SUBTITLES: subtitles,
'CLOSED-CAPTIONS': captions,
} = M3U8Parser.parseMasterPlaylistMedia(string, url, parsedResult);
if (audioTracks.length) {
// check if we have found an audio track embedded in main playlist (audio track without URI attribute)
@ -433,11 +409,12 @@ class PlaylistLoader implements NetworkComponentAPI {
audioTracks.unshift({
type: 'main',
name: 'main',
groupId: 'main',
default: false,
autoselect: false,
forced: false,
id: -1,
attrs: new AttrList({}),
attrs: new AttrList({}) as MediaAttributes,
bitrate: 0,
url: '',
});
@ -449,11 +426,14 @@ class PlaylistLoader implements NetworkComponentAPI {
audioTracks,
subtitles,
captions,
contentSteering,
url,
stats,
networkDetails,
sessionData,
sessionKeys,
startTimeOffset,
variableList,
});
}
@ -461,35 +441,27 @@ class PlaylistLoader implements NetworkComponentAPI {
response: LoaderResponse,
stats: LoaderStats,
context: PlaylistLoaderContext,
networkDetails: any
networkDetails: any,
loader: Loader<PlaylistLoaderContext> | undefined
): void {
const hls = this.hls;
const { id, level, type } = context;
const url = getResponseUrl(response, context);
const levelUrlId = Number.isFinite(id as number) ? id : 0;
const levelId = Number.isFinite(level as number) ? level : levelUrlId;
const levelUrlId = Number.isFinite(id as number) ? (id as number) : 0;
const levelId = Number.isFinite(level as number)
? (level as number)
: levelUrlId;
const levelType = mapContextToLevelType(context);
const levelDetails: LevelDetails = M3U8Parser.parseLevelPlaylist(
response.data as string,
url,
levelId!,
levelId,
levelType,
levelUrlId!
levelUrlId,
this.variableList
);
if (!levelDetails.fragments.length) {
hls.trigger(Events.ERROR, {
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.LEVEL_EMPTY_ERROR,
fatal: false,
url: url,
reason: 'no fragments found in level',
level: typeof context.level === 'number' ? context.level : undefined,
});
return;
}
// We have done our first request (Manifest-type) and receive
// not a master playlist but a chunk-list (track/level)
// We fire the manifest-loaded event anyway with the parsed level-details
@ -511,6 +483,9 @@ class PlaylistLoader implements NetworkComponentAPI {
networkDetails,
sessionData: null,
sessionKeys: null,
contentSteering: null,
startTimeOffset: null,
variableList: null,
});
}
@ -520,24 +495,35 @@ class PlaylistLoader implements NetworkComponentAPI {
// extend the context with the new levelDetails property
context.levelDetails = levelDetails;
this.handlePlaylistLoaded(response, stats, context, networkDetails);
this.handlePlaylistLoaded(
levelDetails,
response,
stats,
context,
networkDetails,
loader
);
}
private handleManifestParsingError(
response: LoaderResponse,
context: PlaylistLoaderContext,
reason: string,
networkDetails: any
error: Error,
networkDetails: any,
stats: LoaderStats
): void {
this.hls.trigger(Events.ERROR, {
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.MANIFEST_PARSING_ERROR,
fatal: context.type === PlaylistContextType.MANIFEST,
url: response.url,
reason,
err: error,
error,
reason: error.message,
response,
context,
networkDetails,
stats,
});
}
@ -545,15 +531,24 @@ class PlaylistLoader implements NetworkComponentAPI {
context: PlaylistLoaderContext,
networkDetails: any,
timeout = false,
response?: LoaderResponse
response: { code: number; text: string } | undefined,
stats: LoaderStats
): void {
logger.warn(
`[playlist-loader]: A network ${
timeout ? 'timeout' : 'error'
} occurred while loading ${context.type} level: ${context.level} id: ${
context.id
} group-id: "${context.groupId}"`
);
let message = `A network ${
timeout
? 'timeout'
: 'error' + (response ? ' (status ' + response.code + ')' : '')
} occurred while loading ${context.type}`;
if (context.type === PlaylistContextType.LEVEL) {
message += `: ${context.level} id: ${context.id}`;
} else if (
context.type === PlaylistContextType.AUDIO_TRACK ||
context.type === PlaylistContextType.SUBTITLE_TRACK
) {
message += ` id: ${context.id} group-id: "${context.groupId}"`;
}
const error = new Error(message);
logger.warn(`[playlist-loader]: ${message}`);
let details = ErrorDetails.UNKNOWN;
let fatal = false;
@ -597,46 +592,76 @@ class PlaylistLoader implements NetworkComponentAPI {
url: context.url,
loader,
context,
error,
networkDetails,
stats,
};
if (response) {
errorData.response = response;
const url = networkDetails?.url || context.url;
errorData.response = { url, data: undefined as any, ...response };
}
this.hls.trigger(Events.ERROR, errorData);
}
private handlePlaylistLoaded(
levelDetails: LevelDetails,
response: LoaderResponse,
stats: LoaderStats,
context: PlaylistLoaderContext,
networkDetails: any
networkDetails: any,
loader: Loader<PlaylistLoaderContext> | undefined
): void {
const {
type,
level,
id,
groupId,
loader,
levelDetails,
deliveryDirectives,
} = context;
if (!levelDetails?.targetduration) {
this.handleManifestParsingError(
const hls = this.hls;
const { type, level, id, groupId, deliveryDirectives } = context;
const url = getResponseUrl(response, context);
const parent = mapContextToLevelType(context);
const levelIndex =
typeof context.level === 'number' && parent === PlaylistLevelType.MAIN
? (level as number)
: undefined;
if (!levelDetails.fragments.length) {
const error = new Error('No Segments found in Playlist');
hls.trigger(Events.ERROR, {
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.LEVEL_EMPTY_ERROR,
fatal: false,
url,
error,
reason: error.message,
response,
context,
'invalid target duration',
networkDetails
);
level: levelIndex,
parent,
networkDetails,
stats,
});
return;
}
if (!loader) {
if (!levelDetails.targetduration) {
levelDetails.playlistParsingError = new Error('Missing Target Duration');
}
const error = levelDetails.playlistParsingError;
if (error) {
hls.trigger(Events.ERROR, {
type: ErrorTypes.NETWORK_ERROR,
details: ErrorDetails.LEVEL_PARSING_ERROR,
fatal: false,
url,
error,
reason: error.message,
response,
context,
level: levelIndex,
parent,
networkDetails,
stats,
});
return;
}
if (levelDetails.live) {
if (levelDetails.live && loader) {
if (loader.getCacheAge) {
levelDetails.ageHeader = loader.getCacheAge() || 0;
}
@ -648,9 +673,9 @@ class PlaylistLoader implements NetworkComponentAPI {
switch (type) {
case PlaylistContextType.MANIFEST:
case PlaylistContextType.LEVEL:
this.hls.trigger(Events.LEVEL_LOADED, {
hls.trigger(Events.LEVEL_LOADED, {
details: levelDetails,
level: level || 0,
level: levelIndex || 0,
id: id || 0,
stats,
networkDetails,
@ -658,7 +683,7 @@ class PlaylistLoader implements NetworkComponentAPI {
});
break;
case PlaylistContextType.AUDIO_TRACK:
this.hls.trigger(Events.AUDIO_TRACK_LOADED, {
hls.trigger(Events.AUDIO_TRACK_LOADED, {
details: levelDetails,
id: id || 0,
groupId: groupId || '',
@ -668,7 +693,7 @@ class PlaylistLoader implements NetworkComponentAPI {
});
break;
case PlaylistContextType.SUBTITLE_TRACK:
this.hls.trigger(Events.SUBTITLE_TRACK_LOADED, {
hls.trigger(Events.SUBTITLE_TRACK_LOADED, {
details: levelDetails,
id: id || 0,
groupId: groupId || '',
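For low-latency streams, the loader above derives a tighter refresh timeout from the playlist's PART-TARGET and TARGETDURATION before clamping the load policy. A worked example with illustrative values:

```ts
// Illustrative values: PART-TARGET of 1 s and TARGETDURATION of 4 s.
const partTarget = 1;
const targetDuration = 4;

const maxLowLatencyPlaylistRefresh =
  Math.max(partTarget * 3, targetDuration * 0.8) * 1000; // max(3000, 3200) = 3200 ms

// The policy's time-to-first-byte and total-load timeouts are then capped at this value,
// so a stalled live playlist reload fails fast enough for the next refresh attempt.
console.log(maxLowLatencyPlaylistRefresh); // 3200
```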

View file

@ -1,58 +0,0 @@
/*
* Push the performance monitor as the last core component in hls.ts
* so that it is the last class to handle events.
*
* coreComponents.push(new PerformanceMonitor(this));
*
* TODO: Add this to the demo page or a performance test page
*/
import { Events } from '../events';
import { logger } from '../utils/logger';
import Hls from '../hls';
import type { FragBufferedData } from '../types/events';
export default class PerformanceMonitor {
private hls: Hls;
constructor(hls: Hls) {
this.hls = hls;
this.hls.on(Events.FRAG_BUFFERED, this.onFragBuffered);
}
destroy() {
this.hls.off(Events.FRAG_BUFFERED);
}
onFragBuffered(event: Events.FRAG_BUFFERED, data: FragBufferedData) {
logFragStats(data);
}
}
function logFragStats(data: FragBufferedData) {
const { frag, part } = data;
const stats = part ? part.stats : frag.stats;
const tLoad = stats.loading.end - stats.loading.start;
const tBuffer = stats.buffering.end - stats.buffering.start;
const tParse = stats.parsing.end - stats.parsing.start;
const tTotal = stats.buffering.end - stats.loading.start;
logger.log(`[performance-monitor]: Stats for fragment ${frag.sn} ${
part ? ' part ' + part.index : ''
} of level ${frag.level}:
Size: ${(stats.total / 1024).toFixed(3)} kB
Chunk Count: ${stats.chunkCount}
Request: ${stats.loading.start.toFixed(3)} ms
First Byte: ${stats.loading.first.toFixed(3)} ms
Parse Start ${stats.parsing.start.toFixed(3)} ms
Buffering Start: ${stats.buffering.start.toFixed(3)} ms
First Buffer: ${stats.buffering.first.toFixed(3)} ms
Parse End: ${stats.parsing.end.toFixed(3)} ms
Buffering End: ${stats.buffering.end.toFixed(3)} ms
Load Duration: ${tLoad.toFixed(3)} ms
Parse Duration: ${tParse.toFixed(3)} ms
Buffer Duration: ${tBuffer.toFixed(3)} ms
End-To-End Duration: ${tTotal.toFixed(3)} ms`);
}
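With the standalone PerformanceMonitor removed above, an application that still wants per-fragment timing can subscribe to FRAG_BUFFERED itself. A minimal sketch using the public event API, reading the same stat fields as the deleted logFragStats:

```ts
import Hls from 'hls.js';

const hls = new Hls();
hls.on(Hls.Events.FRAG_BUFFERED, (event, data) => {
  // Part stats take precedence for LL-HLS parts, mirroring logFragStats above.
  const stats = data.part ? data.part.stats : data.frag.stats;
  const loadMs = stats.loading.end - stats.loading.start;
  const parseMs = stats.parsing.end - stats.parsing.start;
  const bufferMs = stats.buffering.end - stats.buffering.start;
  console.log(
    `frag ${data.frag.sn} level ${data.frag.level}: ` +
      `load ${loadMs.toFixed(1)} ms, parse ${parseMs.toFixed(1)} ms, buffer ${bufferMs.toFixed(1)} ms`
  );
});
```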

View file

@ -389,9 +389,6 @@ class MP4 {
);
}
/**
* @param tracks... (optional) {array} the tracks associated with this movie
*/
static moov(tracks) {
let i = tracks.length;
const boxes: Uint8Array[] = [];
@ -1029,8 +1026,7 @@ class MP4 {
/**
* Generate a track box.
* @param track {object} a track definition
* @return {Uint8Array} the track box
* @param track a track definition
*/
static trak(track) {
track.duration = track.duration || 0xffffffff;

View file

@ -13,7 +13,10 @@ import {
RemuxedUserdata,
} from '../types/remuxer';
import { PlaylistLevelType } from '../types/loader';
import { toMsFromMpegTsClock } from '../utils/timescale-conversion';
import {
RationalTimestamp,
toMsFromMpegTsClock,
} from '../utils/timescale-conversion';
import type {
AudioSample,
AvcSample,
@ -39,8 +42,8 @@ export default class MP4Remuxer implements Remuxer {
private config: HlsConfig;
private typeSupported: any;
private ISGenerated: boolean = false;
private _initPTS!: number;
private _initDTS!: number;
private _initPTS: RationalTimestamp | null = null;
private _initDTS: RationalTimestamp | null = null;
private nextAvcDts: number | null = null;
private nextAudioPts: number | null = null;
private videoSampleDuration: number | null = null;
@ -71,7 +74,7 @@ export default class MP4Remuxer implements Remuxer {
destroy() {}
resetTimeStamp(defaultTimeStamp) {
resetTimeStamp(defaultTimeStamp: RationalTimestamp | null) {
logger.log('[mp4-remuxer]: initPTS & initDTS reset');
this._initPTS = this._initDTS = defaultTimeStamp;
}
@ -144,7 +147,12 @@ export default class MP4Remuxer implements Remuxer {
if (canRemuxAvc) {
if (!this.ISGenerated) {
initSegment = this.generateIS(audioTrack, videoTrack, timeOffset);
initSegment = this.generateIS(
audioTrack,
videoTrack,
timeOffset,
accurateTimeOffset
);
}
const isVideoContiguous = this.isVideoContiguous;
@ -196,7 +204,12 @@ export default class MP4Remuxer implements Remuxer {
logger.warn(
'[mp4-remuxer]: regenerate InitSegment as audio detected'
);
initSegment = this.generateIS(audioTrack, videoTrack, timeOffset);
initSegment = this.generateIS(
audioTrack,
videoTrack,
timeOffset,
accurateTimeOffset
);
}
audio = this.remuxAudio(
audioTrack,
@ -216,7 +229,12 @@ export default class MP4Remuxer implements Remuxer {
logger.warn(
'[mp4-remuxer]: regenerate InitSegment as video detected'
);
initSegment = this.generateIS(audioTrack, videoTrack, timeOffset);
initSegment = this.generateIS(
audioTrack,
videoTrack,
timeOffset,
accurateTimeOffset
);
}
video = this.remuxVideo(
videoTrack,
@ -242,7 +260,7 @@ export default class MP4Remuxer implements Remuxer {
}
// Allow ID3 and text to remux, even if more audio/video samples are required
if (this.ISGenerated) {
if (this.ISGenerated && this._initPTS && this._initDTS) {
if (id3Track.samples.length) {
id3 = flushTextTrackMetadataCueSamples(
id3Track,
@ -274,13 +292,15 @@ export default class MP4Remuxer implements Remuxer {
generateIS(
audioTrack: DemuxedAudioTrack,
videoTrack: DemuxedAvcTrack,
timeOffset
timeOffset: number,
accurateTimeOffset: boolean
): InitSegmentData | undefined {
const audioSamples = audioTrack.samples;
const videoSamples = videoTrack.samples;
const typeSupported = this.typeSupported;
const tracks: TrackSet = {};
const computePTSDTS = !Number.isFinite(this._initPTS);
const _initPTS = this._initPTS;
let computePTSDTS = !_initPTS || accurateTimeOffset;
let container = 'audio/mp4';
let initPTS: number | undefined;
let initDTS: number | undefined;
@ -322,9 +342,13 @@ export default class MP4Remuxer implements Remuxer {
};
if (computePTSDTS) {
timescale = audioTrack.inputTimeScale;
// remember first PTS of this demuxing context. for audio, PTS = DTS
initPTS = initDTS =
audioSamples[0].pts - Math.round(timescale * timeOffset);
if (!_initPTS || timescale !== _initPTS.timescale) {
// remember first PTS of this demuxing context. for audio, PTS = DTS
initPTS = initDTS =
audioSamples[0].pts - Math.round(timescale * timeOffset);
} else {
computePTSDTS = false;
}
}
}
@ -344,21 +368,33 @@ export default class MP4Remuxer implements Remuxer {
};
if (computePTSDTS) {
timescale = videoTrack.inputTimeScale;
const startPTS = this.getVideoStartPts(videoSamples);
const startOffset = Math.round(timescale * timeOffset);
initDTS = Math.min(
initDTS as number,
normalizePts(videoSamples[0].dts, startPTS) - startOffset
);
initPTS = Math.min(initPTS as number, startPTS - startOffset);
if (!_initPTS || timescale !== _initPTS.timescale) {
const startPTS = this.getVideoStartPts(videoSamples);
const startOffset = Math.round(timescale * timeOffset);
initDTS = Math.min(
initDTS as number,
normalizePts(videoSamples[0].dts, startPTS) - startOffset
);
initPTS = Math.min(initPTS as number, startPTS - startOffset);
} else {
computePTSDTS = false;
}
}
}
if (Object.keys(tracks).length) {
this.ISGenerated = true;
if (computePTSDTS) {
this._initPTS = initPTS as number;
this._initDTS = initDTS as number;
this._initPTS = {
baseTime: initPTS as number,
timescale: timescale as number,
};
this._initDTS = {
baseTime: initDTS as number,
timescale: timescale as number,
};
} else {
initPTS = timescale = undefined;
}
return {
@ -378,8 +414,8 @@ export default class MP4Remuxer implements Remuxer {
const timeScale: number = track.inputTimeScale;
const inputSamples: Array<AvcSample> = track.samples;
const outputSamples: Array<Mp4Sample> = [];
const nbSamples: number = inputSamples.length;
const initPTS: number = this._initPTS;
const nbSamples = inputSamples.length;
const initPTS = this._initPTS as RationalTimestamp;
let nextAvcDts = this.nextAvcDts;
let offset = 8;
let mp4SampleDuration = this.videoSampleDuration;
@ -401,10 +437,11 @@ export default class MP4Remuxer implements Remuxer {
// PTS is coded on 33bits, and can loop from -2^32 to 2^32
// PTSNormalize will make PTS/DTS value monotonic, we use last known DTS value as reference value
const initTime = (initPTS.baseTime * timeScale) / initPTS.timescale;
for (let i = 0; i < nbSamples; i++) {
const sample = inputSamples[i];
sample.pts = normalizePts(sample.pts - initPTS, nextAvcDts);
sample.dts = normalizePts(sample.dts - initPTS, nextAvcDts);
sample.pts = normalizePts(sample.pts - initTime, nextAvcDts);
sample.dts = normalizePts(sample.dts - initTime, nextAvcDts);
if (sample.dts < inputSamples[i > 0 ? i - 1 : i].dts) {
sortSamples = true;
}
@ -452,7 +489,7 @@ export default class MP4Remuxer implements Remuxer {
)} ms (${delta}dts) overlapping between fragments detected`
);
}
if (!foundOverlap || nextAvcDts > inputSamples[0].pts) {
if (!foundOverlap || nextAvcDts >= inputSamples[0].pts) {
firstDTS = nextAvcDts;
const firstPTS = inputSamples[0].pts - delta;
inputSamples[0].dts = firstDTS;
@ -507,6 +544,7 @@ export default class MP4Remuxer implements Remuxer {
type: ErrorTypes.MUX_ERROR,
details: ErrorDetails.REMUX_ALLOC_ERROR,
fatal: false,
error: err,
bytes: mdatSize,
reason: `fail allocating video mdat ${mdatSize}`,
});
@ -640,11 +678,6 @@ export default class MP4Remuxer implements Remuxer {
}
}
}
console.assert(
mp4SampleDuration !== null,
'mp4SampleDuration must be computed'
);
// next AVC sample DTS should be equal to last sample DTS + last sample duration (in PES timescale)
mp4SampleDuration =
stretchedLastFrame || !mp4SampleDuration
@ -674,12 +707,8 @@ export default class MP4Remuxer implements Remuxer {
nb: outputSamples.length,
dropped: track.dropped,
};
track.samples = [];
track.dropped = 0;
console.assert(mdat.length, 'MDAT length must not be zero');
return data;
}
@ -700,7 +729,7 @@ export default class MP4Remuxer implements Remuxer {
? AAC_SAMPLES_PER_FRAME
: MPEG_AUDIO_SAMPLE_PER_FRAME;
const inputSampleDuration: number = mp4SampleDuration * scaleFactor;
const initPTS: number = this._initPTS;
const initPTS = this._initPTS as RationalTimestamp;
const rawMPEG: boolean =
track.segmentCodec === 'mp3' && this.typeSupported.mpeg;
const outputSamples: Array<Mp4Sample> = [];
@ -721,6 +750,7 @@ export default class MP4Remuxer implements Remuxer {
// this helps ensuring audio continuity
// and this also avoids audio glitches/cut when switching quality, or reporting wrong duration on first audio frame
const timeOffsetMpegTS = timeOffset * inputTimeScale;
const initTime = (initPTS.baseTime * inputTimeScale) / initPTS.timescale;
this.isAudioContiguous = contiguous =
contiguous ||
((inputSamples.length &&
@ -728,14 +758,14 @@ export default class MP4Remuxer implements Remuxer {
((accurateTimeOffset &&
Math.abs(timeOffsetMpegTS - nextAudioPts) < 9000) ||
Math.abs(
normalizePts(inputSamples[0].pts - initPTS, timeOffsetMpegTS) -
normalizePts(inputSamples[0].pts - initTime, timeOffsetMpegTS) -
nextAudioPts
) <
20 * inputSampleDuration)) as boolean);
// compute normalized PTS
inputSamples.forEach(function (sample) {
sample.pts = normalizePts(sample.pts - initPTS, timeOffsetMpegTS);
sample.pts = normalizePts(sample.pts - initTime, timeOffsetMpegTS);
});
if (!contiguous || nextAudioPts < 0) {
@ -880,6 +910,7 @@ export default class MP4Remuxer implements Remuxer {
type: ErrorTypes.MUX_ERROR,
details: ErrorDetails.REMUX_ALLOC_ERROR,
fatal: false,
error: err,
bytes: mdatSize,
reason: `fail allocating audio mdat ${mdatSize}`,
});
@ -944,8 +975,6 @@ export default class MP4Remuxer implements Remuxer {
};
this.isAudioContiguous = true;
console.assert(mdat.length, 'MDAT length must not be zero');
return audioData;
}
@ -962,11 +991,13 @@ export default class MP4Remuxer implements Remuxer {
const scaleFactor: number = inputTimeScale / mp4timeScale;
const nextAudioPts: number | null = this.nextAudioPts;
// sync with video's timestamp
const initDTS = this._initDTS as RationalTimestamp;
const init90kHz = (initDTS.baseTime * 90000) / initDTS.timescale;
const startDTS: number =
(nextAudioPts !== null
? nextAudioPts
: videoData.startDTS * inputTimeScale) + this._initDTS;
const endDTS: number = videoData.endDTS * inputTimeScale + this._initDTS;
: videoData.startDTS * inputTimeScale) + init90kHz;
const endDTS: number = videoData.endDTS * inputTimeScale + init90kHz;
// one sample's duration value
const frameDuration: number = scaleFactor * AAC_SAMPLES_PER_FRAME;
// samples count of this segment's duration
@ -1032,8 +1063,8 @@ function findKeyframeIndex(samples: Array<AvcSample>): number {
export function flushTextTrackMetadataCueSamples(
track: DemuxedMetadataTrack,
timeOffset: number,
initPTS: number,
initDTS: number
initPTS: RationalTimestamp,
initDTS: RationalTimestamp
): RemuxedMetadata | undefined {
const length = track.samples.length;
if (!length) {
@ -1045,11 +1076,15 @@ export function flushTextTrackMetadataCueSamples(
// setting id3 pts, dts to relative time
// using this._initPTS and this._initDTS to calculate relative time
sample.pts =
normalizePts(sample.pts - initPTS, timeOffset * inputTimeScale) /
inputTimeScale;
normalizePts(
sample.pts - (initPTS.baseTime * inputTimeScale) / initPTS.timescale,
timeOffset * inputTimeScale
) / inputTimeScale;
sample.dts =
normalizePts(sample.dts - initDTS, timeOffset * inputTimeScale) /
inputTimeScale;
normalizePts(
sample.dts - (initDTS.baseTime * inputTimeScale) / initDTS.timescale,
timeOffset * inputTimeScale
) / inputTimeScale;
}
const samples = track.samples;
track.samples = [];
@ -1061,7 +1096,7 @@ export function flushTextTrackMetadataCueSamples(
export function flushTextTrackUserdataCueSamples(
track: DemuxedUserdataTrack,
timeOffset: number,
initPTS: number
initPTS: RationalTimestamp
): RemuxedUserdata | undefined {
const length = track.samples.length;
if (!length) {
@ -1074,8 +1109,10 @@ export function flushTextTrackUserdataCueSamples(
// setting text pts, dts to relative time
// using this._initPTS and this._initDTS to calculate relative time
sample.pts =
normalizePts(sample.pts - initPTS, timeOffset * inputTimeScale) /
inputTimeScale;
normalizePts(
sample.pts - (initPTS.baseTime * 90000) / initPTS.timescale,
timeOffset * inputTimeScale
) / inputTimeScale;
}
track.samples.sort((a, b) => a.pts - b.pts);
const samples = track.samples;
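The remuxer changes above replace the plain numeric initPTS/initDTS with a RationalTimestamp ({ baseTime, timescale }) and rescale it into each track's input timescale before subtracting it from sample PTS/DTS values. A small sketch of that conversion, using local types that mirror the diff rather than library code:

```ts
interface RationalTimestamp {
  baseTime: number; // value expressed in "timescale" units
  timescale: number; // units per second
}

// Rescale a timestamp into another timescale, as in
// `initTime = (initPTS.baseTime * timeScale) / initPTS.timescale` above.
function toTimescale(ts: RationalTimestamp, targetTimescale: number): number {
  return (ts.baseTime * targetTimescale) / ts.timescale;
}

// Example: an initPTS captured at 90 kHz (MPEG-TS clock) applied to a 44.1 kHz audio track.
const initPTS: RationalTimestamp = { baseTime: 900_000, timescale: 90_000 }; // 10 s
const initTime = toTimescale(initPTS, 44_100); // 441000 ticks, still 10 s
console.log(initTime);
```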

View file

@ -29,19 +29,20 @@ import type {
PassthroughTrack,
} from '../types/demuxer';
import type { DecryptData } from '../loader/level-key';
import type { RationalTimestamp } from '../utils/timescale-conversion';
class PassThroughRemuxer implements Remuxer {
private emitInitSegment: boolean = false;
private audioCodec?: string;
private videoCodec?: string;
private initData?: InitData;
private initPTS?: number;
private initPTS: RationalTimestamp | null = null;
private initTracks?: TrackSet;
private lastEndTime: number | null = null;
public destroy() {}
public resetTimeStamp(defaultInitPTS) {
public resetTimeStamp(defaultInitPTS: RationalTimestamp | null) {
this.initPTS = defaultInitPTS;
this.lastEndTime = null;
}
@ -64,7 +65,7 @@ class PassThroughRemuxer implements Remuxer {
private generateInitSegment(initSegment: Uint8Array | undefined): void {
let { audioCodec, videoCodec } = this;
if (!initSegment || !initSegment.byteLength) {
if (!initSegment?.byteLength) {
this.initTracks = undefined;
this.initData = undefined;
return;
@ -121,7 +122,8 @@ class PassThroughRemuxer implements Remuxer {
videoTrack: PassthroughTrack,
id3Track: DemuxedMetadataTrack,
textTrack: DemuxedUserdataTrack,
timeOffset: number
timeOffset: number,
accurateTimeOffset: boolean
): RemuxerResult {
let { initPTS, lastEndTime } = this;
const result: RemuxerResult = {
@ -142,7 +144,7 @@ class PassThroughRemuxer implements Remuxer {
// The binary segment data is added to the videoTrack in the mp4demuxer. We don't check to see if the data is only
// audio or video (or both); adding it to video was an arbitrary choice.
const data = videoTrack.samples;
if (!data || !data.length) {
if (!data?.length) {
return result;
}
@ -151,11 +153,11 @@ class PassThroughRemuxer implements Remuxer {
timescale: 1,
};
let initData = this.initData;
if (!initData || !initData.length) {
if (!initData?.length) {
this.generateInitSegment(data);
initData = this.initData;
}
if (!initData || !initData.length) {
if (!initData?.length) {
// We can't remux if the initSegment could not be generated
logger.warn('[passthrough-remuxer.ts]: Failed to generate initSegment.');
return result;
@ -165,17 +167,30 @@ class PassThroughRemuxer implements Remuxer {
this.emitInitSegment = false;
}
const duration = getDuration(data, initData);
const startDTS = getStartDTS(initData, data);
if (!Number.isFinite(initPTS!)) {
this.initPTS = initSegment.initPTS = initPTS = startDTS - timeOffset;
const decodeTime = startDTS === null ? timeOffset : startDTS;
if (
isInvalidInitPts(initPTS, decodeTime, timeOffset, duration) ||
(initSegment.timescale !== initPTS.timescale && accurateTimeOffset)
) {
initSegment.initPTS = decodeTime - timeOffset;
if (initPTS && initPTS.timescale === 1) {
logger.warn(
`Adjusting initPTS by ${initSegment.initPTS - initPTS.baseTime}`
);
}
this.initPTS = initPTS = {
baseTime: initSegment.initPTS,
timescale: 1,
};
}
const duration = getDuration(data, initData);
const startTime = audioTrack
? startDTS - (initPTS as number)
? decodeTime - initPTS.baseTime / initPTS.timescale
: (lastEndTime as number);
const endTime = startTime + duration;
offsetStartDTS(initData, data, initPTS as number);
offsetStartDTS(initData, data, initPTS.baseTime / initPTS.timescale);
if (duration > 0) {
this.lastEndTime = endTime;
@ -212,19 +227,18 @@ class PassThroughRemuxer implements Remuxer {
result.audio = track.type === 'audio' ? track : undefined;
result.video = track.type !== 'audio' ? track : undefined;
result.initSegment = initSegment;
const initPtsNum = this.initPTS ?? 0;
result.id3 = flushTextTrackMetadataCueSamples(
id3Track,
timeOffset,
initPtsNum,
initPtsNum
initPTS,
initPTS
);
if (textTrack.samples.length) {
result.text = flushTextTrackUserdataCueSamples(
textTrack,
timeOffset,
initPtsNum
initPTS
);
}
@ -232,6 +246,21 @@ class PassThroughRemuxer implements Remuxer {
}
}
function isInvalidInitPts(
initPTS: RationalTimestamp | null,
startDTS: number,
timeOffset: number,
duration: number
): initPTS is null {
if (initPTS === null) {
return true;
}
// InitPTS is invalid when the start time computed from it drifts from the expected program time by more than the segment duration, with a minimum of one second
const minDuration = Math.max(duration, 1);
const startTime = startDTS - initPTS.baseTime / initPTS.timescale;
return Math.abs(startTime - timeOffset) > minDuration;
}
function getParsedTrackCodec(
track: InitDataTrack | undefined,
type: ElementaryStreamTypes.AUDIO | ElementaryStreamTypes.VIDEO
@ -244,7 +273,7 @@ function getParsedTrackCodec(
// Provide defaults based on codec type
// This allows for some playback of some fmp4 playlists without CODECS defined in manifest
if (parsedCodec === 'hvc1' || parsedCodec === 'hev1') {
return 'hvc1.1.c.L120.90';
return 'hvc1.1.6.L120.90';
}
if (parsedCodec === 'av01') {
return 'av01.0.04M.08';
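The isInvalidInitPts() helper above discards a cached initPTS when the resulting start time drifts from the expected timeOffset by more than the segment duration (with a one-second floor). A worked example with illustrative numbers:

```ts
// Illustrative numbers only.
const initPTS = { baseTime: 20, timescale: 1 }; // cached from an earlier segment
const startDTS = 130; // decode time read from the fmp4 segment, in seconds
const timeOffset = 100; // where the fragment should land on the timeline, in seconds
const duration = 6; // segment duration in seconds

const startTime = startDTS - initPTS.baseTime / initPTS.timescale; // 110
const invalid = Math.abs(startTime - timeOffset) > Math.max(duration, 1); // |110 - 100| = 10 > 6
console.log(invalid); // true, so a new initPTS of (decodeTime - timeOffset) = 30 would be computed
```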

16
node_modules/hls.js/src/task-loop.ts generated vendored
View file

@ -1,4 +1,5 @@
/**
* @ignore
* Sub-class specialization of EventHandler base class.
*
* TaskLoop allows a task function to be scheduled (optionally repeatedly) on the main loop,
@ -49,26 +50,21 @@ export default class TaskLoop {
protected onHandlerDestroyed() {}
/**
* @returns {boolean}
*/
public hasInterval(): boolean {
return !!this._tickInterval;
}
/**
* @returns {boolean}
*/
public hasNextTick(): boolean {
return !!this._tickTimer;
}
/**
* @param {number} millis Interval time (ms)
* @returns {boolean} True when interval has been scheduled, false when already scheduled (no effect)
* @param millis - Interval time (ms)
* @returns True when interval has been scheduled, false when already scheduled (no effect)
*/
public setInterval(millis: number): boolean {
if (!this._tickInterval) {
this._tickCallCount = 0;
this._tickInterval = self.setInterval(this._boundTick, millis);
return true;
}
@ -76,7 +72,7 @@ export default class TaskLoop {
}
/**
* @returns {boolean} True when interval was cleared, false when none was set (no effect)
* @returns True when interval was cleared, false when none was set (no effect)
*/
public clearInterval(): boolean {
if (this._tickInterval) {
@ -88,7 +84,7 @@ export default class TaskLoop {
}
/**
* @returns {boolean} True when timeout was cleared, false when none was set (no effect)
* @returns True when timeout was cleared, false when none was set (no effect)
*/
public clearNextTick(): boolean {
if (this._tickTimer) {

View file

@ -6,7 +6,7 @@ export const CMCDVersion = 1;
/**
* CMCD Object Type
*/
export enum CMCDObjectType {
export const enum CMCDObjectType {
MANIFEST = 'm',
AUDIO = 'a',
VIDEO = 'v',
@ -21,17 +21,12 @@ export enum CMCDObjectType {
/**
* CMCD Streaming Format
*/
export enum CMCDStreamingFormat {
DASH = 'd',
HLS = 'h',
SMOOTH = 's',
OTHER = 'o',
}
export const CMCDStreamingFormatHLS = 'h';
/**
* CMCD Streaming Type
*/
export enum CMCDStreamType {
const enum CMCDStreamType {
VOD = 'v',
LIVE = 'l',
}
@ -212,7 +207,7 @@ export interface CMCD {
*
* If the streaming format being requested is unknown, then this key MUST NOT be used.
*/
sf?: CMCDStreamingFormat;
sf?: typeof CMCDStreamingFormatHLS;
/**
* Session ID

View file

@ -1,7 +1,14 @@
import EwmaBandWidthEstimator from '../utils/ewma-bandwidth-estimator';
export interface ComponentAPI {
destroy(): void;
}
export interface AbrComponentAPI extends ComponentAPI {
nextAutoLevel: number;
readonly bwEstimator?: EwmaBandWidthEstimator;
}
export interface NetworkComponentAPI extends ComponentAPI {
startLoad(startPosition: number): void;
stopLoad(): void;

View file

@ -1,3 +1,5 @@
import type { RationalTimestamp } from '../utils/timescale-conversion';
export interface Demuxer {
demux(
data: Uint8Array,
@ -18,7 +20,7 @@ export interface Demuxer {
videoCodec: string | undefined,
trackDuration: number
);
resetTimeStamp(defaultInitPTS?: number | null): void;
resetTimeStamp(defaultInitPTS?: RationalTimestamp | null): void;
resetContiguity(): void;
}
@ -70,8 +72,8 @@ export interface DemuxedVideoTrack extends DemuxedTrack {
height?: number;
pixelRatio?: [number, number];
audFound?: boolean;
pps?: number[];
sps?: number[];
pps?: Uint8Array[];
sps?: Uint8Array[];
naluState?: number;
samples: AvcSample[] | Uint8Array;
}
@ -88,7 +90,7 @@ export interface DemuxedUserdataTrack extends DemuxedTrack {
samples: UserdataSample[];
}
export enum MetadataSchema {
export const enum MetadataSchema {
audioId3 = 'org.id3',
dateRange = 'com.apple.quicktime.HLS',
emsg = 'https://aomedia.org/emsg/ID3',

View file

@ -3,7 +3,12 @@ import type { Fragment } from '../loader/fragment';
// eslint-disable-next-line import/no-duplicates
import type { Part } from '../loader/fragment';
import type { LevelDetails } from '../loader/level-details';
import type { HlsUrlParameters, Level, LevelParsed } from './level';
import type {
HlsUrlParameters,
Level,
LevelParsed,
VariableMap,
} from './level';
import type { MediaPlaylist, MediaPlaylistType } from './media-playlist';
import type {
Loader,
@ -21,8 +26,9 @@ import type { ErrorDetails, ErrorTypes } from '../errors';
import type { MetadataSample, UserdataSample } from './demuxer';
import type { AttrList } from '../utils/attr-list';
import type { HlsListeners } from '../events';
import { KeyLoaderInfo } from '../loader/key-loader';
import { LevelKey } from '../loader/level-key';
import type { KeyLoaderInfo } from '../loader/key-loader';
import type { LevelKey } from '../loader/level-key';
import type { IErrorAction } from '../controller/error-controller';
export interface MediaAttachingData {
media: HTMLMediaElement;
@ -78,16 +84,24 @@ export interface ManifestLoadingData {
url: string;
}
export type ContentSteeringOptions = {
uri: string;
pathwayId: string;
};
export interface ManifestLoadedData {
audioTracks: MediaPlaylist[];
captions?: MediaPlaylist[];
contentSteering: ContentSteeringOptions | null;
levels: LevelParsed[];
networkDetails: any;
sessionData: Record<string, AttrList> | null;
sessionKeys: LevelKey[] | null;
startTimeOffset: number | null;
stats: LoaderStats;
subtitles?: MediaPlaylist[];
url: string;
variableList: VariableMap | null;
}
export interface ManifestParsedData {
@ -158,17 +172,9 @@ export interface LevelPTSUpdatedData {
end: number;
}
export interface AudioTrackSwitchingData {
id: number;
name: string;
groupId: string;
type: MediaPlaylistType | 'main';
url: string;
}
export interface AudioTrackSwitchingData extends MediaPlaylist {}
export interface AudioTrackSwitchedData {
id: number;
}
export interface AudioTrackSwitchedData extends MediaPlaylist {}
export interface AudioTrackLoadedData extends TrackLoadedData {}
@ -217,23 +223,28 @@ export interface FPSDropLevelCappingData {
export interface ErrorData {
type: ErrorTypes;
details: ErrorDetails;
error: Error;
fatal: boolean;
errorAction?: IErrorAction;
buffer?: number;
bytes?: number;
chunkMeta?: ChunkMetadata;
context?: PlaylistLoaderContext;
error?: Error;
event?: keyof HlsListeners | 'demuxerWorker';
frag?: Fragment;
level?: number | undefined;
levelRetry?: boolean;
loader?: Loader<LoaderContext>;
networkDetails?: any;
stats?: LoaderStats;
mimeType?: string;
reason?: string;
response?: LoaderResponse;
url?: string;
parent?: PlaylistLevelType;
/**
* @deprecated Use ErrorData.error
*/
err?: {
// comes from transmuxer interface
message: string;
@ -352,6 +363,6 @@ export interface BackBufferData {
}
/**
* Deprecated; please use BackBufferData
* @deprecated Use BackBufferData
*/
export interface LiveBackBufferData extends BackBufferData {}
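For context, the reshaped `ErrorData` above (with a non-optional `error` and the new `errorAction`) is what `ERROR` listeners receive. A minimal, hedged sketch of consuming it through the public API, using the usual recovery calls:

```ts
import Hls from 'hls.js';
import type { ErrorData } from 'hls.js';

const hls = new Hls();
hls.on(Hls.Events.ERROR, (_event, data: ErrorData) => {
  // data.error is now always present; the older data.err field stays only as a deprecated alias
  console.warn(`${data.type}/${data.details}: ${data.error.message}`);
  if (data.fatal) {
    switch (data.type) {
      case Hls.ErrorTypes.NETWORK_ERROR:
        hls.startLoad(); // retry network-level failures
        break;
      case Hls.ErrorTypes.MEDIA_ERROR:
        hls.recoverMediaError();
        break;
      default:
        hls.destroy();
    }
  }
});
```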

View file

@ -4,6 +4,9 @@ import type { FragLoadedData } from './events';
export interface FragmentEntity {
body: Fragment;
// appendedPTS is the latest buffered presentation time within the fragment's time range.
// It is used to determine which fragment is appended at any given position, and to derive hls.currentLevel.
appendedPTS: number | null;
loaded: FragLoadedData | null;
buffered: boolean;
range: { [key in SourceBufferName]: FragmentBufferedRange };

View file

@ -20,33 +20,28 @@ export interface LevelParsed {
export interface LevelAttributes extends AttrList {
'ALLOWED-CPC'?: string;
AUDIO?: string;
AUTOSELECT?: string;
'AVERAGE-BANDWIDTH'?: string;
BANDWIDTH?: string;
BYTERANGE?: string;
'CLOSED-CAPTIONS'?: string;
CHARACTERISTICS?: string;
CODECS?: string;
DEFAULT?: string;
FORCED?: string;
'FRAME-RATE'?: string;
'HDCP-LEVEL'?: string;
LANGUAGE?: string;
NAME?: string;
'HDCP-LEVEL'?: 'TYPE-0' | 'TYPE-1' | 'NONE';
'PATHWAY-ID'?: string;
'PROGRAM-ID'?: string;
RESOLUTION?: string;
SCORE?: string;
'STABLE-VARIANT-ID'?: string;
SUBTITLES?: string;
TYPE?: string;
URI?: string;
'VIDEO-RANGE'?: string;
'SUPPLEMENTAL-CODECS'?: string;
VIDEO?: string;
'VIDEO-RANGE'?: 'SDR' | 'HLG' | 'PQ';
}
export const HdcpLevels = ['NONE', 'TYPE-0', 'TYPE-1', 'TYPE-2', null] as const;
export type HdcpLevel = typeof HdcpLevels[number];
export const HdcpLevels = ['NONE', 'TYPE-0', 'TYPE-1', null] as const;
export type HdcpLevel = (typeof HdcpLevels)[number];
export enum HlsSkip {
export type VariableMap = Record<string, string>;
export const enum HlsSkip {
No = '',
Yes = 'YES',
v2 = 'v2',
@ -91,7 +86,7 @@ export class HlsUrlParameters {
}
export class Level {
public readonly attrs: LevelAttributes;
public readonly _attrs: LevelAttributes[];
public readonly audioCodec: string | undefined;
public readonly bitrate: number;
public readonly codecSet: string;
@ -101,19 +96,19 @@ export class Level {
public readonly videoCodec: string | undefined;
public readonly width: number;
public readonly unknownCodecs: string[] | undefined;
public audioGroupIds?: string[];
public audioGroupIds?: (string | undefined)[];
public details?: LevelDetails;
public fragmentError: number = 0;
public loadError: number = 0;
public loaded?: { bytes: number; duration: number };
public realBitrate: number = 0;
public textGroupIds?: string[];
public textGroupIds?: (string | undefined)[];
public url: string[];
private _urlId: number = 0;
constructor(data: LevelParsed) {
this.url = [data.url];
this.attrs = data.attrs;
this._attrs = [data.attrs];
this.bitrate = data.bitrate;
if (data.details) {
this.details = data.details;
@ -135,6 +130,14 @@ export class Level {
return Math.max(this.realBitrate, this.bitrate);
}
get attrs(): LevelAttributes {
return this._attrs[this._urlId];
}
get pathwayId(): string {
return this.attrs['PATHWAY-ID'] || '.';
}
get uri(): string {
return this.url[this._urlId] || '';
}
@ -146,8 +149,23 @@ export class Level {
set urlId(value: number) {
const newValue = value % this.url.length;
if (this._urlId !== newValue) {
this.fragmentError = 0;
this.loadError = 0;
this.details = undefined;
this._urlId = newValue;
}
}
get audioGroupId(): string | undefined {
return this.audioGroupIds?.[this.urlId];
}
get textGroupId(): string | undefined {
return this.textGroupIds?.[this.urlId];
}
addFallback(data: LevelParsed) {
this.url.push(data.url);
this._attrs.push(data.attrs);
}
}
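Since `attrs`, `uri`, `pathwayId` and the group-id getters above all resolve through the active `urlId`, redundant (failover) variants registered with `addFallback` report the attributes of the URL currently in use. A hedged sketch of reading these getters from the public `hls.levels` array; the manifest URL is a placeholder:

```ts
import Hls from 'hls.js';

const hls = new Hls();
hls.loadSource('https://example.com/master.m3u8'); // placeholder URL
hls.on(Hls.Events.MANIFEST_PARSED, () => {
  hls.levels.forEach((level, i) => {
    // attrs/uri reflect the currently selected redundant stream for this level
    console.log(i, level.bitrate, level.uri, level.pathwayId, level.audioGroupId);
  });
});
```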

View file

@ -1,3 +1,4 @@
import type { LoaderConfig } from '../config';
import type { Fragment } from '../loader/fragment';
import type { Part } from '../loader/fragment';
import type { KeyLoaderInfo } from '../loader/key-loader';
@ -31,23 +32,41 @@ export interface KeyLoaderContext extends LoaderContext {
}
export interface LoaderConfiguration {
// LoaderConfig policy that overrides required settings
loadPolicy: LoaderConfig;
/**
* @deprecated use LoaderConfig timeoutRetry and errorRetry maxNumRetry
*/
// Max number of load retries
maxRetry: number;
/**
* @deprecated use LoaderConfig maxTimeToFirstByteMs and maxLoadTimeMs
*/
// Timeout after which `onTimeOut` callback will be triggered
// (if loading is still not finished after that delay)
// when loading has not finished after that delay
timeout: number;
/**
* @deprecated use LoaderConfig timeoutRetry and errorRetry retryDelayMs
*/
// Delay between an I/O error and following connection retry (ms).
// This is to avoid spamming the server
retryDelay: number;
/**
* @deprecated use LoaderConfig timeoutRetry and errorRetry maxRetryDelayMs
*/
// max connection retry delay (ms)
maxRetryDelay: number;
// When streaming progressively, this is the minimum chunk size required to emit a PROGRESS event
highWaterMark: number;
highWaterMark?: number;
}
export interface LoaderResponse {
url: string;
data: string | ArrayBuffer;
data?: string | ArrayBuffer | Object;
// Errors can include HTTP status code and error message
// Successful responses should include status code 200
code?: number;
text?: string;
}
export interface LoaderStats {
@ -98,7 +117,8 @@ export type LoaderOnError<T extends LoaderContext> = (
text: string;
},
context: T,
networkDetails: any
networkDetails: any,
stats: LoaderStats
) => void;
export type LoaderOnTimeout<T extends LoaderContext> = (
@ -139,33 +159,32 @@ export interface Loader<T extends LoaderContext> {
* @returns time object being loaded
*/
getCacheAge?: () => number | null;
getResponseHeader?: (name: string) => string | null;
context: T;
stats: LoaderStats;
}
export enum PlaylistContextType {
export const enum PlaylistContextType {
MANIFEST = 'manifest',
LEVEL = 'level',
AUDIO_TRACK = 'audioTrack',
SUBTITLE_TRACK = 'subtitleTrack',
}
export enum PlaylistLevelType {
export const enum PlaylistLevelType {
MAIN = 'main',
AUDIO = 'audio',
SUBTITLE = 'subtitle',
}
export interface PlaylistLoaderContext extends LoaderContext {
loader?: Loader<PlaylistLoaderContext>;
type: PlaylistContextType;
// the level index to load
level: number | null;
// level or track id from LevelLoadingData / TrackLoadingData
id: number | null;
// track group id
groupId: string | null;
groupId?: string;
// internal representation of a parsed m3u8 level playlist
levelDetails?: LevelDetails;
// Blocking playlist request delivery directives (or null if none were added to the playlist url)
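The deprecated `maxRetry`/`timeout`/`retryDelay`/`maxRetryDelay` fields above are superseded by the `loadPolicy` (`LoaderConfig`) settings. A hedged sketch of an equivalent policy passed through the player config; the `fragLoadPolicy` key and the numeric values illustrate the new shape rather than recommended settings:

```ts
import Hls from 'hls.js';

const hls = new Hls({
  fragLoadPolicy: {
    default: {
      maxTimeToFirstByteMs: 10000, // replaces the first-byte half of `timeout`
      maxLoadTimeMs: 120000,       // replaces the overall `timeout`
      timeoutRetry: { maxNumRetry: 4, retryDelayMs: 0, maxRetryDelayMs: 0 },
      errorRetry: { maxNumRetry: 6, retryDelayMs: 1000, maxRetryDelayMs: 8000 },
    },
  },
});
```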

View file

@ -1,5 +1,5 @@
import type { LevelParsed } from './level';
import type { AttrList } from '../utils/attr-list';
export interface AudioGroup {
id?: string;
codec?: string;
@ -14,11 +14,12 @@ export type SubtitlePlaylistType = 'SUBTITLES' | 'CLOSED-CAPTIONS';
export type MediaPlaylistType = MainPlaylistType | SubtitlePlaylistType;
// audioTracks, captions and subtitles returned by `M3U8Parser.parseMasterPlaylistMedia`
export interface MediaPlaylist extends LevelParsed {
export interface MediaPlaylist extends Omit<LevelParsed, 'attrs'> {
attrs: MediaAttributes;
autoselect: boolean; // implicit false if not present
default: boolean; // implicit false if not present
forced: boolean; // implicit false if not present
groupId?: string; // not optional in HLS playlists, but it isn't always specified.
groupId: string; // required in HLS playlists
id: number; // incrementing number to track media playlists
instreamId?: string;
lang?: string;
@ -26,3 +27,20 @@ export interface MediaPlaylist extends LevelParsed {
// 'main' is a custom type added to signal an audioCodec in the main track; see playlist-loader~L310
type: MediaPlaylistType | 'main';
}
export interface MediaAttributes extends AttrList {
'ASSOC-LANGUAGE'?: string;
AUTOSELECT?: 'YES' | 'NO';
CHANNELS?: string;
CHARACTERISTICS?: string;
DEFAULT?: 'YES' | 'NO';
FORCED?: 'YES' | 'NO';
'GROUP-ID': string;
'INSTREAM-ID'?: string;
LANGUAGE?: string;
NAME: string;
'PATHWAY-ID'?: string;
'STABLE-RENDITION-ID'?: string;
TYPE?: 'AUDIO' | 'VIDEO' | 'SUBTITLES' | 'CLOSED-CAPTIONS';
URI?: string;
}
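For reference, the keys of `MediaAttributes` above mirror `EXT-X-MEDIA` attributes one-to-one. A hedged illustration of a playlist line and the attribute values it would yield (the concrete values are invented; the real object is an `AttrList` instance rather than a plain literal):

```ts
// #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud1",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="audio/en.m3u8"
const attrs = {
  TYPE: 'AUDIO',
  'GROUP-ID': 'aud1', // required by the interface above
  NAME: 'English',    // required by the interface above
  LANGUAGE: 'en',
  DEFAULT: 'YES',
  AUTOSELECT: 'YES',
  URI: 'audio/en.m3u8',
};
```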

View file

@ -10,6 +10,7 @@ import {
import type { SourceBufferName } from './buffer';
import type { PlaylistLevelType } from './loader';
import type { DecryptData } from '../loader/level-key';
import type { RationalTimestamp } from '../utils/timescale-conversion';
export interface Remuxer {
remux(
@ -28,7 +29,7 @@ export interface Remuxer {
videoCodec: string | undefined,
decryptdata: DecryptData | null
): void;
resetTimeStamp(defaultInitPTS): void;
resetTimeStamp(defaultInitPTS: RationalTimestamp | null): void;
resetNextTimestamp(): void;
destroy(): void;
}

View file

@ -1,5 +1,5 @@
const DECIMAL_RESOLUTION_REGEX = /^(\d+)x(\d+)$/; // eslint-disable-line no-useless-escape
const ATTR_LIST_REGEX = /\s*(.+?)\s*=((?:\".*?\")|.*?)(?:,|$)/g; // eslint-disable-line no-useless-escape
const DECIMAL_RESOLUTION_REGEX = /^(\d+)x(\d+)$/;
const ATTR_LIST_REGEX = /(.+?)=(".*?"|.*?)(?:,|$)/g;
// adapted from https://github.com/kanongil/node-m3u8parse/blob/master/attrlist.js
export class AttrList {
@ -12,6 +12,10 @@ export class AttrList {
for (const attr in attrs) {
if (attrs.hasOwnProperty(attr)) {
if (attr.substring(0, 2) === 'X-') {
this.clientAttrs = this.clientAttrs || [];
this.clientAttrs.push(attr);
}
this[attr] = attrs[attr];
}
}
@ -99,8 +103,8 @@ export class AttrList {
) {
value = value.slice(1, -1);
}
attrs[match[1]] = value;
const name = match[1].trim();
attrs[name] = value;
}
return attrs;
}
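A hedged usage sketch of `AttrList` with the tightened regex and the new `clientAttrs` bookkeeping above (the attribute string is made up; the import path is assumed for illustration, since `AttrList` is an internal utility):

```ts
import { AttrList } from './attr-list'; // internal module, path assumed

const attrs = new AttrList(
  'BANDWIDTH=1280000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2",X-COM-EXAMPLE="demo"'
);
attrs.decimalInteger('BANDWIDTH');     // 1280000
attrs.decimalResolution('RESOLUTION'); // { width: 1280, height: 720 }
attrs.CODECS;                          // 'avc1.64001f,mp4a.40.2' (surrounding quotes stripped)
attrs.clientAttrs;                     // ['X-COM-EXAMPLE'], keys starting with "X-" are now recorded
```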

View file

@ -6,15 +6,15 @@ const BinarySearch = {
* This requires the condition to only match one item in the array,
* and for the array to be ordered.
*
* @param {Array<T>} list The array to search.
* @param {BinarySearchComparison<T>} comparisonFn
* @param list The array to search.
* @param comparisonFn
* Called and provided a candidate item as the first argument.
* Should return:
* > -1 if the item should be located at a lower index than the provided item.
* > 1 if the item should be located at a higher index than the provided item.
* > 0 if the item is the item you're looking for.
*
* @return {T | null} The object if it is found or null otherwise.
* @returns the object if found, otherwise returns null
*/
search: function <T>(
list: T[],

View file

@ -1,7 +1,5 @@
/**
* @module BufferHelper
*
* Providing methods dealing with buffer length retrieval for example.
* Provides methods dealing with buffer length retrieval for example.
*
* In general, a helper around HTML5 MediaElement TimeRanges gathered from `buffered` property.
*
@ -35,9 +33,6 @@ const noopBuffered: TimeRanges = {
export class BufferHelper {
/**
* Return true if `media`'s buffered include `position`
* @param {Bufferable} media
* @param {number} position
* @returns {boolean}
*/
static isBuffered(media: Bufferable, position: number): boolean {
try {

View file

@ -208,7 +208,7 @@ const backgroundColors = [
'transparent',
];
enum VerboseLevel {
const enum VerboseLevel {
ERROR = 0,
TEXT = 1,
WARNING = 2,
@ -1159,9 +1159,9 @@ class Cea608Parser {
/**
* Parse Command.
* @returns {Boolean} Tells if a command was found
* @returns True if a command was found
*/
parseCmd(a: number, b: number) {
parseCmd(a: number, b: number): boolean {
const { cmdHistory } = this;
const cond1 =
(a === 0x14 || a === 0x1c || a === 0x15 || a === 0x1d) &&
@ -1229,9 +1229,8 @@ class Cea608Parser {
/**
* Parse midrow styling command
* @returns {Boolean}
*/
parseMidrow(a: number, b: number) {
parseMidrow(a: number, b: number): boolean {
let chNr: number = 0;
if ((a === 0x11 || a === 0x19) && b >= 0x20 && b <= 0x2f) {
@ -1304,7 +1303,7 @@ class Cea608Parser {
/**
* Interpret the second byte of the pac, and return the information.
* @returns {Object} pacData with style parameters.
* @returns pacData with style parameters
*/
interpretPAC(row: number, byte: number): PACData {
let pacIndex;
@ -1391,7 +1390,7 @@ class Cea608Parser {
/**
* Parse extended background attributes as well as new foreground color black.
* @returns {Boolean} Tells if background attributes are found
* @returns True if background attributes are found
*/
parseBackgroundAttributes(a: number, b: number): boolean {
const case1 = (a === 0x10 || a === 0x18) && b >= 0x20 && b <= 0x2f;

View file

@ -1,3 +1,5 @@
import { getMediaSource } from './mediasource-helper';
// from http://mp4ra.org/codecs.html
const sampleEntryCodesISO = {
audio: {
@ -71,6 +73,8 @@ const sampleEntryCodesISO = {
},
};
const MediaSource = getMediaSource();
export type CodecType = 'audio' | 'video';
export function isCodecType(codec: string, type: CodecType): boolean {
@ -79,7 +83,8 @@ export function isCodecType(codec: string, type: CodecType): boolean {
}
export function isCodecSupportedInMp4(codec: string, type: CodecType): boolean {
return MediaSource.isTypeSupported(
`${type || 'video'}/mp4;codecs="${codec}"`
return (
MediaSource?.isTypeSupported(`${type || 'video'}/mp4;codecs="${codec}"`) ??
false
);
}
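With the optional chaining above, `isCodecSupportedInMp4` degrades to `false` where `MediaSource` is unavailable instead of throwing. A hedged usage sketch (results naturally vary by browser):

```ts
// Safe even in environments without MediaSource (e.g. some WebViews):
isCodecSupportedInMp4('avc1.42E01E', 'video'); // typically true where H.264 in fMP4 is playable
isCodecSupportedInMp4('mp4a.40.2', 'audio');   // typically true where AAC-LC in fMP4 is playable
isCodecSupportedInMp4('vp09.00.10.08', 'video'); // false on browsers without VP9-in-MP4 support
```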

View file

@ -63,7 +63,7 @@ const Cues: CuesInterface = {
const id = generateCueId(startTime, endTime, cueText);
// If this cue already exists in the track do not push it
if (!track || !track.cues || !track.cues.getCueById(id)) {
if (!track?.cues?.getCueById(id)) {
cue = new Cue(startTime, endTime, cueText);
cue.id = id;
cue.line = r + 1;

View file

@ -14,23 +14,35 @@ class EwmaBandWidthEstimator {
private minDelayMs_: number;
private slow_: EWMA;
private fast_: EWMA;
private defaultTTFB_: number;
private ttfb_: EWMA;
constructor(slow: number, fast: number, defaultEstimate: number) {
constructor(
slow: number,
fast: number,
defaultEstimate: number,
defaultTTFB: number = 100
) {
this.defaultEstimate_ = defaultEstimate;
this.minWeight_ = 0.001;
this.minDelayMs_ = 50;
this.slow_ = new EWMA(slow);
this.fast_ = new EWMA(fast);
this.defaultTTFB_ = defaultTTFB;
this.ttfb_ = new EWMA(slow);
}
update(slow: number, fast: number) {
const { slow_, fast_ } = this;
if (this.slow_.halfLife !== slow) {
const { slow_, fast_, ttfb_ } = this;
if (slow_.halfLife !== slow) {
this.slow_ = new EWMA(slow, slow_.getEstimate(), slow_.getTotalWeight());
}
if (this.fast_.halfLife !== fast) {
if (fast_.halfLife !== fast) {
this.fast_ = new EWMA(fast, fast_.getEstimate(), fast_.getTotalWeight());
}
if (ttfb_.halfLife !== slow) {
this.ttfb_ = new EWMA(slow, ttfb_.getEstimate(), ttfb_.getTotalWeight());
}
}
sample(durationMs: number, numBytes: number) {
@ -44,9 +56,16 @@ class EwmaBandWidthEstimator {
this.slow_.sample(durationS, bandwidthInBps);
}
sampleTTFB(ttfb: number) {
// weight is a frequency curve applied to TTFB in seconds
// (longer times have less weight with expected input under 1 second)
const seconds = ttfb / 1000;
const weight = Math.sqrt(2) * Math.exp(-Math.pow(seconds, 2) / 2);
this.ttfb_.sample(weight, Math.max(ttfb, 5));
}
canEstimate(): boolean {
const fast = this.fast_;
return fast && fast.getTotalWeight() >= this.minWeight_;
return this.fast_.getTotalWeight() >= this.minWeight_;
}
getEstimate(): number {
@ -61,6 +80,14 @@ class EwmaBandWidthEstimator {
}
}
getEstimateTTFB(): number {
if (this.ttfb_.getTotalWeight() >= this.minWeight_) {
return this.ttfb_.getEstimate();
} else {
return this.defaultTTFB_;
}
}
destroy() {}
}
export default EwmaBandWidthEstimator;
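The new `sampleTTFB` above weights each time-to-first-byte sample with a bell-shaped curve so that fast responses dominate the running estimate, and clamps the sampled value to at least 5 ms. A hedged numeric illustration of that weight formula (values rounded):

```ts
const weightFor = (ttfbMs: number) =>
  Math.sqrt(2) * Math.exp(-Math.pow(ttfbMs / 1000, 2) / 2);

weightFor(100);  // about 1.407, near the maximum of sqrt(2) (about 1.414)
weightFor(500);  // about 1.248
weightFor(1000); // about 0.858
weightFor(3000); // about 0.016, multi-second TTFBs barely move the estimate
```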

View file

@ -5,6 +5,7 @@ import {
LoaderStats,
LoaderConfiguration,
LoaderOnProgress,
LoaderResponse,
} from '../types/loader';
import { LoadStats } from '../loader/load-stats';
import ChunkCache from '../demux/chunk-cache';
@ -27,6 +28,8 @@ export function fetchSupported() {
return false;
}
const BYTERANGE = /(\d+)-(\d+)\/(\d+)/;
class FetchLoader implements Loader<LoaderContext> {
private fetchSetup: Function;
private requestTimeout?: number;
@ -52,7 +55,7 @@ class FetchLoader implements Loader<LoaderContext> {
abortInternal(): void {
const response = this.response;
if (!response || !response.ok) {
if (!response?.ok) {
this.stats.aborted = true;
this.controller.abort();
}
@ -81,12 +84,17 @@ class FetchLoader implements Loader<LoaderContext> {
callbacks.onProgress;
const isArrayBuffer = context.responseType === 'arraybuffer';
const LENGTH = isArrayBuffer ? 'byteLength' : 'length';
const { maxTimeToFirstByteMs, maxLoadTimeMs } = config.loadPolicy;
this.context = context;
this.config = config;
this.callbacks = callbacks;
this.request = this.fetchSetup(context, initParams);
self.clearTimeout(this.requestTimeout);
config.timeout =
maxTimeToFirstByteMs && Number.isFinite(maxTimeToFirstByteMs)
? maxTimeToFirstByteMs
: maxLoadTimeMs;
this.requestTimeout = self.setTimeout(() => {
this.abortInternal();
callbacks.onTimeout(stats, context, this.response);
@ -97,6 +105,15 @@ class FetchLoader implements Loader<LoaderContext> {
.then((response: Response): Promise<string | ArrayBuffer> => {
this.response = this.loader = response;
const first = Math.max(self.performance.now(), stats.loading.start);
self.clearTimeout(this.requestTimeout);
config.timeout = maxLoadTimeMs;
this.requestTimeout = self.setTimeout(() => {
this.abortInternal();
callbacks.onTimeout(stats, context, this.response);
}, maxLoadTimeMs - (first - stats.loading.start));
if (!response.ok) {
const { status, statusText } = response;
throw new FetchError(
@ -105,11 +122,9 @@ class FetchLoader implements Loader<LoaderContext> {
response
);
}
stats.loading.first = Math.max(
self.performance.now(),
stats.loading.start
);
stats.total = parseInt(response.headers.get('Content-Length') || '0');
stats.loading.first = first;
stats.total = getContentLength(response.headers) || stats.total;
if (onProgress && Number.isFinite(config.highWaterMark)) {
return this.loadProgressively(
@ -124,6 +139,9 @@ class FetchLoader implements Loader<LoaderContext> {
if (isArrayBuffer) {
return response.arrayBuffer();
}
if (context.responseType === 'json') {
return response.json();
}
return response.text();
})
.then((responseData: string | ArrayBuffer) => {
@ -138,9 +156,10 @@ class FetchLoader implements Loader<LoaderContext> {
stats.loaded = stats.total = total;
}
const loaderResponse = {
const loaderResponse: LoaderResponse = {
url: response.url,
data: responseData,
code: response.status,
};
if (onProgress && !Number.isFinite(config.highWaterMark)) {
@ -161,7 +180,8 @@ class FetchLoader implements Loader<LoaderContext> {
callbacks.onError(
{ code, text },
context,
error ? error.details : null
error ? error.details : null,
stats
);
});
}
@ -175,6 +195,10 @@ class FetchLoader implements Loader<LoaderContext> {
return result;
}
getResponseHeader(name: string): string | null {
return this.response ? this.response.headers.get(name) : null;
}
private loadProgressively(
response: Response,
stats: LoaderStats,
@ -243,6 +267,27 @@ function getRequestParameters(context: LoaderContext, signal): any {
return initParams;
}
function getByteRangeLength(byteRangeHeader: string): number | undefined {
const result = BYTERANGE.exec(byteRangeHeader);
if (result) {
return parseInt(result[2]) - parseInt(result[1]) + 1;
}
}
function getContentLength(headers: Headers): number | undefined {
const contentRange = headers.get('Content-Range');
if (contentRange) {
const byteRangeLength = getByteRangeLength(contentRange);
if (Number.isFinite(byteRangeLength)) {
return byteRangeLength;
}
}
const contentLength = headers.get('Content-Length');
if (contentLength) {
return parseInt(contentLength);
}
}
function getRequest(context: LoaderContext, initParams: any): Request {
return new self.Request(context.url, initParams);
}
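`getContentLength` above prefers `Content-Range` when present (typical for byte-range segment requests) and falls back to `Content-Length`; when neither is usable, `stats.total` keeps its previous value. A hedged sketch of the byte-range arithmetic (header value invented):

```ts
const BYTERANGE_EXAMPLE = /(\d+)-(\d+)\/(\d+)/;
const contentRange = 'bytes 100-599/146515'; // example Content-Range header value
const m = BYTERANGE_EXAMPLE.exec(contentRange);
const chunkLength = m ? parseInt(m[2]) - parseInt(m[1]) + 1 : undefined; // 500 bytes loaded
```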

View file

@ -2,7 +2,10 @@ import { findBox } from './mp4-tools';
import { parseTimeStamp } from './vttparser';
import VTTCue from './vttcue';
import { utf8ArrayToStr } from '../demux/id3';
import { toTimescaleFromScale } from './timescale-conversion';
import {
RationalTimestamp,
toTimescaleFromScale,
} from './timescale-conversion';
import { generateCueId } from './webvtt-parser';
export const IMSC1_CODEC = 'stpp.ttml.im1t';
@ -23,8 +26,7 @@ const textAlignToLineAlign: Partial<Record<string, LineAlignSetting>> = {
export function parseIMSC1(
payload: ArrayBuffer,
initPTS: number,
timescale: number,
initPTS: RationalTimestamp,
callBack: (cues: Array<VTTCue>) => any,
errorCallBack: (error: Error) => any
) {
@ -36,7 +38,7 @@ export function parseIMSC1(
const ttmlList = results.map((mdat) => utf8ArrayToStr(mdat));
const syncTime = toTimescaleFromScale(initPTS, 1, timescale);
const syncTime = toTimescaleFromScale(initPTS.baseTime, 1, initPTS.timescale);
try {
ttmlList.forEach((ttml) => callBack(parseTTML(ttml, syncTime)));

View file

@ -71,7 +71,9 @@ export function enableLogs(debugConfig: boolean | ILogger, id: string): void {
// Some browsers don't allow to use bind on console object anyway
// fallback to default if needed
try {
exportedLogger.log(`Debug logs enabled for "${id}"`);
exportedLogger.log(
`Debug logs enabled for "${id}" in hls.js version ${__VERSION__}`
);
} catch (e) {
exportedLogger = fakeLogger;
}

View file

@ -3,7 +3,7 @@ import type { DRMSystemOptions, EMEControllerConfig } from '../config';
/**
* @see https://developer.mozilla.org/en-US/docs/Web/API/Navigator/requestMediaKeySystemAccess
*/
export enum KeySystems {
export const enum KeySystems {
CLEARKEY = 'org.w3.clearkey',
FAIRPLAY = 'com.apple.fps',
PLAYREADY = 'com.microsoft.playready',
@ -11,7 +11,7 @@ export enum KeySystems {
}
// Playlist #EXT-X-KEY KEYFORMAT values
export enum KeySystemFormats {
export const enum KeySystemFormats {
CLEARKEY = 'org.w3.clearkey',
FAIRPLAY = 'com.apple.streamingkeydelivery',
PLAYREADY = 'com.microsoft.playready',
@ -34,7 +34,7 @@ export function keySystemFormatToKeySystemDomain(
}
// System IDs for which we can extract a key ID from "encrypted" event PSSH
export enum KeySystemIds {
export const enum KeySystemIds {
// CENC = '1077efecc0b24d02ace33c1e52e2fb4b'
// CLEARKEY = 'e2719d58a985b3c9781ab030af78d30e',
// FAIRPLAY = '94ce86fb07ff4f43adb893d2fa968ca2',

View file

@ -3,5 +3,6 @@
*/
export function getMediaSource(): typeof MediaSource | undefined {
if (typeof self === 'undefined') return undefined;
return self.MediaSource || ((self as any).WebKitMediaSource as MediaSource);
}

View file

@ -136,8 +136,7 @@ export function parseSegmentIndex(sidx: Uint8Array): SidxInfo | null {
const referenceType = (referenceInfo & 0x80000000) >>> 31;
if (referenceType === 1) {
// eslint-disable-next-line no-console
console.warn('SIDX has hierarchical references (not supported)');
logger.warn('SIDX has hierarchical references (not supported)');
return null;
}
@ -188,8 +187,8 @@ export function parseSegmentIndex(sidx: Uint8Array): SidxInfo | null {
* moov > trak > mdia > mdhd.timescale
* moov > trak > mdia > hdlr
* ```
* @param initSegment {Uint8Array} the bytes of the init segment
* @return {InitData} a hash of track type to timescale values or null if
* @param initSegment the bytes of the init segment
* @returns a hash of track type to timescale values or null if
* the init segment is malformed.
*/
@ -346,15 +345,18 @@ export function parseSinf(sinf: Uint8Array): Uint8Array | null {
* ```
* It requires the timescale value from the mdhd to interpret.
*
* @param initData {InitData} a hash of track type to timescale values
* @param fmp4 {Uint8Array} the bytes of the mp4 fragment
* @return {number} the earliest base media decode start time for the
* @param initData - a hash of track type to timescale values
* @param fmp4 - the bytes of the mp4 fragment
* @returns the earliest base media decode start time for the
* fragment, in seconds
*/
export function getStartDTS(initData: InitData, fmp4: Uint8Array): number {
export function getStartDTS(
initData: InitData,
fmp4: Uint8Array
): number | null {
// we need info from two children of each track fragment box
return (
findBox(fmp4, ['moof', 'traf']).reduce((result: number | null, traf) => {
return findBox(fmp4, ['moof', 'traf']).reduce(
(result: number | null, traf) => {
const tfdt = findBox(traf, ['tfdt'])[0];
const version = tfdt[0];
const start = findBox(traf, ['tfhd']).reduce(
@ -365,7 +367,16 @@ export function getStartDTS(initData: InitData, fmp4: Uint8Array): number {
if (track) {
let baseTime = readUint32(tfdt, 4);
if (version === 1) {
baseTime *= Math.pow(2, 32);
// If value is too large, assume signed 64-bit. Negative track fragment decode times are invalid, but they exist in the wild.
// This prevents large values from being used for initPTS, which can cause playlist sync issues.
// https://github.com/video-dev/hls.js/issues/5303
if (baseTime === UINT32_MAX) {
logger.warn(
`[mp4-demuxer]: Ignoring assumed invalid signed 64-bit track fragment decode time`
);
return result;
}
baseTime *= UINT32_MAX + 1;
baseTime += readUint32(tfdt, 8);
}
// assume a 90kHz clock if no timescale was specified
@ -391,7 +402,8 @@ export function getStartDTS(initData: InitData, fmp4: Uint8Array): number {
return start;
}
return result;
}, null) || 0
},
null
);
}
@ -991,8 +1003,7 @@ export function parseEmsg(data: Uint8Array): IEmsgParsingData {
presentationTime = 2 ** 32 * leftPresentationTime + rightPresentationTime;
if (!Number.isSafeInteger(presentationTime)) {
presentationTime = Number.MAX_SAFE_INTEGER;
// eslint-disable-next-line no-console
console.warn(
logger.warn(
'Presentation time exceeds safe integer limit and wrapped to max safe integer in parsing emsg box'
);
}
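Both fixes in this file concern 64-bit box fields read as two 32-bit halves. A hedged worked example of the version-1 `tfdt` arithmetic in `getStartDTS` above, with `UINT32_MAX` assumed to be `0xffffffff`; an upper word equal to `UINT32_MAX` is treated as an invalid (negative) decode time and skipped:

```ts
const UINT32_MAX = 0xffffffff; // 4294967295, assumed definition for this sketch

const hi = 0x00000002; // readUint32(tfdt, 4), upper 32 bits
const lo = 0x00001000; // readUint32(tfdt, 8), lower 32 bits
const baseTime = hi * (UINT32_MAX + 1) + lo; // 2 * 2^32 + 4096 = 8589938688
const startSeconds = baseTime / 90000;       // about 95443.8 s at the default 90 kHz timescale
```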

View file

@ -1,34 +1,39 @@
const MPEG_TS_CLOCK_FREQ_HZ = 90000;
export type RationalTimestamp = {
baseTime: number; // ticks
timescale: number; // ticks per second
};
export function toTimescaleFromBase(
value,
baseTime: number,
destScale: number,
srcBase: number = 1,
round: boolean = false
): number {
const result = value * destScale * srcBase; // equivalent to `(value * scale) / (1 / base)`
const result = baseTime * destScale * srcBase; // equivalent to `(baseTime * destScale) / (1 / srcBase)`
return round ? Math.round(result) : result;
}
export function toTimescaleFromScale(
value,
baseTime: number,
destScale: number,
srcScale: number = 1,
round: boolean = false
): number {
return toTimescaleFromBase(value, destScale, 1 / srcScale, round);
return toTimescaleFromBase(baseTime, destScale, 1 / srcScale, round);
}
export function toMsFromMpegTsClock(
value: number,
baseTime: number,
round: boolean = false
): number {
return toTimescaleFromBase(value, 1000, 1 / MPEG_TS_CLOCK_FREQ_HZ, round);
return toTimescaleFromBase(baseTime, 1000, 1 / MPEG_TS_CLOCK_FREQ_HZ, round);
}
export function toMpegTsClockFromTimescale(
value: number,
baseTime: number,
srcScale: number = 1
): number {
return toTimescaleFromBase(value, MPEG_TS_CLOCK_FREQ_HZ, 1 / srcScale);
return toTimescaleFromBase(baseTime, MPEG_TS_CLOCK_FREQ_HZ, 1 / srcScale);
}
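With `RationalTimestamp`, callers pass `{ baseTime, timescale }` pairs instead of pre-divided floating-point seconds, which avoids rounding drift when rescaling. A hedged example of the conversions the helpers above perform (sample values are arbitrary):

```ts
const initPTS: RationalTimestamp = { baseTime: 900000, timescale: 90000 }; // 10 s on a 90 kHz clock

initPTS.baseTime / initPTS.timescale;                            // 10 seconds
toTimescaleFromScale(initPTS.baseTime, 1000, initPTS.timescale); // 10000, the same instant in ms
toMpegTsClockFromTimescale(initPTS.baseTime, initPTS.timescale); // 900000, already in 90 kHz ticks
```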

View file

@ -26,7 +26,7 @@ export default (function () {
}
const AllowedDirections = ['', 'lr', 'rl'] as const;
type Direction = typeof AllowedDirections[number];
type Direction = (typeof AllowedDirections)[number];
const AllowedAlignments = [
'start',
@ -35,7 +35,7 @@ export default (function () {
'left',
'right',
] as const;
type Alignment = typeof AllowedAlignments[number];
type Alignment = (typeof AllowedAlignments)[number];
function isAllowedValue<T, A>(allowed: T, value: string): A | false {
if (typeof value !== 'string') {

View file

@ -346,7 +346,7 @@ export class VTTParser {
// strip off the UTF-8 BOM if any
// https://en.wikipedia.org/wiki/Byte_order_mark#UTF-8
const m = line.match(/^()?WEBVTT([ \t].*)?$/);
if (!m || !m[0]) {
if (!m?.[0]) {
throw new Error('Malformed WebVTT signature.');
}

View file

@ -1,6 +1,9 @@
import { VTTParser } from './vttparser';
import { utf8ArrayToStr } from '../demux/id3';
import { toMpegTsClockFromTimescale } from './timescale-conversion';
import {
RationalTimestamp,
toMpegTsClockFromTimescale,
} from './timescale-conversion';
import { normalizePts } from '../remux/mp4-remuxer';
import type { VTTCCs } from '../types/vtt';
@ -89,8 +92,7 @@ const calculateOffset = function (vttCCs: VTTCCs, cc, presentationTime) {
export function parseWebVTT(
vttByteArray: ArrayBuffer,
initPTS: number,
timescale: number,
initPTS: RationalTimestamp,
vttCCs: VTTCCs,
cc: number,
timeOffset: number,
@ -105,7 +107,10 @@ export function parseWebVTT(
.replace(LINEBREAKS, '\n')
.split('\n');
const cues: VTTCue[] = [];
const initPTS90Hz = toMpegTsClockFromTimescale(initPTS, timescale);
const init90kHz = toMpegTsClockFromTimescale(
initPTS.baseTime,
initPTS.timescale
);
let cueTime = '00:00.000';
let timestampMapMPEGTS = 0;
let timestampMapLOCAL = 0;
@ -118,7 +123,7 @@ export function parseWebVTT(
let cueOffset = vttCCs.ccOffset;
// Calculate subtitle PTS offset
const webVttMpegTsMapOffset = (timestampMapMPEGTS - initPTS90Hz) / 90000;
const webVttMpegTsMapOffset = (timestampMapMPEGTS - init90kHz) / 90000;
// Update offsets for new discontinuities
if (currCC?.new) {

View file

@ -5,13 +5,18 @@ import type {
LoaderStats,
Loader,
LoaderConfiguration,
LoaderResponse,
} from '../types/loader';
import { LoadStats } from '../loader/load-stats';
import { type HlsConfig, RetryConfig } from '../config';
import { getRetryDelay, shouldRetry } from './error-helper';
const AGE_HEADER_LINE_REGEX = /^age:\s*[\d.]+\s*$/m;
const AGE_HEADER_LINE_REGEX = /^age:\s*[\d.]+\s*$/im;
class XhrLoader implements Loader<LoaderContext> {
private xhrSetup: Function | null;
private xhrSetup:
| ((xhr: XMLHttpRequest, url: string) => Promise<void> | void)
| null;
private requestTimeout?: number;
private retryTimeout?: number;
private retryDelay: number;
@ -22,20 +27,20 @@ class XhrLoader implements Loader<LoaderContext> {
private loader: XMLHttpRequest | null = null;
public stats: LoaderStats;
constructor(config /* HlsConfig */) {
this.xhrSetup = config ? config.xhrSetup : null;
constructor(config: HlsConfig) {
this.xhrSetup = config ? config.xhrSetup || null : null;
this.stats = new LoadStats();
this.retryDelay = 0;
}
destroy(): void {
destroy() {
this.callbacks = null;
this.abortInternal();
this.loader = null;
this.config = null;
}
abortInternal(): void {
abortInternal() {
const loader = this.loader;
self.clearTimeout(this.requestTimeout);
self.clearTimeout(this.retryTimeout);
@ -49,7 +54,7 @@ class XhrLoader implements Loader<LoaderContext> {
}
}
abort(): void {
abort() {
this.abortInternal();
if (this.callbacks?.onAbort) {
this.callbacks.onAbort(this.stats, this.context, this.loader);
@ -60,7 +65,7 @@ class XhrLoader implements Loader<LoaderContext> {
context: LoaderContext,
config: LoaderConfiguration,
callbacks: LoaderCallbacks<LoaderContext>
): void {
) {
if (this.stats.loading.start) {
throw new Error('Loader can only be used once.');
}
@ -68,11 +73,10 @@ class XhrLoader implements Loader<LoaderContext> {
this.context = context;
this.config = config;
this.callbacks = callbacks;
this.retryDelay = config.retryDelay;
this.loadInternal();
}
loadInternal(): void {
loadInternal() {
const { config, context } = this;
if (!config) {
return;
@ -84,35 +88,50 @@ class XhrLoader implements Loader<LoaderContext> {
stats.loaded = 0;
const xhrSetup = this.xhrSetup;
try {
if (xhrSetup) {
try {
xhrSetup(xhr, context.url);
} catch (e) {
// fix xhrSetup: (xhr, url) => {xhr.setRequestHeader("Content-Language", "test");}
// not working, as xhr.setRequestHeader expects xhr.readyState === OPEN
if (xhrSetup) {
Promise.resolve()
.then(() => {
if (this.stats.aborted) return;
return xhrSetup(xhr, context.url);
})
.catch((error: Error) => {
xhr.open('GET', context.url, true);
xhrSetup(xhr, context.url);
}
}
if (!xhr.readyState) {
xhr.open('GET', context.url, true);
}
return xhrSetup(xhr, context.url);
})
.then(() => {
if (this.stats.aborted) return;
this.openAndSendXhr(xhr, context, config);
})
.catch((error: Error) => {
// IE11 throws an exception on xhr.open if attempting to access an HTTP resource over HTTPS
this.callbacks!.onError(
{ code: xhr.status, text: error.message },
context,
xhr,
stats
);
return;
});
} else {
this.openAndSendXhr(xhr, context, config);
}
}
const headers = this.context.headers;
if (headers) {
for (const header in headers) {
xhr.setRequestHeader(header, headers[header]);
}
openAndSendXhr(
xhr: XMLHttpRequest,
context: LoaderContext,
config: LoaderConfiguration
) {
if (!xhr.readyState) {
xhr.open('GET', context.url, true);
}
const headers = this.context.headers;
const { maxTimeToFirstByteMs, maxLoadTimeMs } = config.loadPolicy;
if (headers) {
for (const header in headers) {
xhr.setRequestHeader(header, headers[header]);
}
} catch (e) {
// IE11 throws an exception on xhr.open if attempting to access an HTTP resource over HTTPS
this.callbacks!.onError(
{ code: xhr.status, text: e.message },
context,
xhr
);
return;
}
if (context.rangeEnd) {
@ -127,6 +146,10 @@ class XhrLoader implements Loader<LoaderContext> {
xhr.responseType = context.responseType as XMLHttpRequestResponseType;
// setup timeout before we perform request
self.clearTimeout(this.requestTimeout);
config.timeout =
maxTimeToFirstByteMs && Number.isFinite(maxTimeToFirstByteMs)
? maxTimeToFirstByteMs
: maxLoadTimeMs;
this.requestTimeout = self.setTimeout(
this.loadtimeout.bind(this),
config.timeout
@ -134,7 +157,7 @@ class XhrLoader implements Loader<LoaderContext> {
xhr.send();
}
readystatechange(): void {
readystatechange() {
const { context, loader: xhr, stats } = this;
if (!context || !xhr) {
return;
@ -149,41 +172,45 @@ class XhrLoader implements Loader<LoaderContext> {
// >= HEADERS_RECEIVED
if (readyState >= 2) {
// clear xhr timeout and rearm it if readyState less than 4
self.clearTimeout(this.requestTimeout);
if (stats.loading.first === 0) {
stats.loading.first = Math.max(
self.performance.now(),
stats.loading.start
);
// readyState >= 2 AND readyState !==4 (readyState = HEADERS_RECEIVED || LOADING) rearm timeout as xhr not finished yet
if (config.timeout !== config.loadPolicy.maxLoadTimeMs) {
self.clearTimeout(this.requestTimeout);
config.timeout = config.loadPolicy.maxLoadTimeMs;
this.requestTimeout = self.setTimeout(
this.loadtimeout.bind(this),
config.loadPolicy.maxLoadTimeMs -
(stats.loading.first - stats.loading.start)
);
}
}
if (readyState === 4) {
self.clearTimeout(this.requestTimeout);
xhr.onreadystatechange = null;
xhr.onprogress = null;
const status = xhr.status;
// HTTP status codes between 200 and 299 are all successful
const isArrayBuffer = xhr.responseType === 'arraybuffer';
const useResponse = xhr.responseType !== 'text';
if (
status >= 200 &&
status < 300 &&
((isArrayBuffer && xhr.response) || xhr.responseText !== null)
((useResponse && xhr.response) || xhr.responseText !== null)
) {
stats.loading.end = Math.max(
self.performance.now(),
stats.loading.first
);
let data;
let len: number;
if (isArrayBuffer) {
data = xhr.response;
len = data.byteLength;
} else {
data = xhr.responseText;
len = data.length;
}
const data = useResponse ? xhr.response : xhr.responseText;
const len =
xhr.responseType === 'arraybuffer' ? data.byteLength : data.length;
stats.loaded = stats.total = len;
stats.bwEstimate =
(stats.total * 8000) / (stats.loading.end - stats.loading.first);
if (!this.callbacks) {
return;
}
@ -194,67 +221,71 @@ class XhrLoader implements Loader<LoaderContext> {
if (!this.callbacks) {
return;
}
const response = {
const response: LoaderResponse = {
url: xhr.responseURL,
data: data,
code: status,
};
this.callbacks.onSuccess(response, stats, context, xhr);
} else {
const retryConfig = config.loadPolicy.errorRetry;
const retryCount = stats.retry;
// if max nb of retries reached or if http status between 400 and 499 (such error cannot be recovered, retrying is useless), return error
if (
stats.retry >= config.maxRetry ||
(status >= 400 && status < 499)
) {
if (shouldRetry(retryConfig, retryCount, false, status)) {
this.retry(retryConfig);
} else {
logger.error(`${status} while loading ${context.url}`);
this.callbacks!.onError(
{ code: status, text: xhr.statusText },
context,
xhr
xhr,
stats
);
} else {
// retry
logger.warn(
`${status} while loading ${context.url}, retrying in ${this.retryDelay}...`
);
// abort and reset internal state
this.abortInternal();
this.loader = null;
// schedule retry
self.clearTimeout(this.retryTimeout);
this.retryTimeout = self.setTimeout(
this.loadInternal.bind(this),
this.retryDelay
);
// set exponential backoff
this.retryDelay = Math.min(
2 * this.retryDelay,
config.maxRetryDelay
);
stats.retry++;
}
}
} else {
// readyState >= 2 AND readyState !==4 (readyState = HEADERS_RECEIVED || LOADING) rearm timeout as xhr not finished yet
self.clearTimeout(this.requestTimeout);
this.requestTimeout = self.setTimeout(
this.loadtimeout.bind(this),
config.timeout
);
}
}
}
loadtimeout(): void {
logger.warn(`timeout while loading ${this.context.url}`);
const callbacks = this.callbacks;
if (callbacks) {
this.abortInternal();
callbacks.onTimeout(this.stats, this.context, this.loader);
loadtimeout() {
const retryConfig = this.config?.loadPolicy.timeoutRetry;
const retryCount = this.stats.retry;
if (shouldRetry(retryConfig, retryCount, true)) {
this.retry(retryConfig);
} else {
logger.warn(`timeout while loading ${this.context.url}`);
const callbacks = this.callbacks;
if (callbacks) {
this.abortInternal();
callbacks.onTimeout(this.stats, this.context, this.loader);
}
}
}
loadprogress(event: ProgressEvent): void {
retry(retryConfig: RetryConfig) {
const { context, stats } = this;
this.retryDelay = getRetryDelay(retryConfig, stats.retry);
stats.retry++;
logger.warn(
`${status ? 'HTTP Status ' + status : 'Timeout'} while loading ${
context.url
}, retrying ${stats.retry}/${retryConfig.maxNumRetry} in ${
this.retryDelay
}ms`
);
// abort and reset internal state
this.abortInternal();
this.loader = null;
// schedule retry
self.clearTimeout(this.retryTimeout);
this.retryTimeout = self.setTimeout(
this.loadInternal.bind(this),
this.retryDelay
);
}
loadprogress(event: ProgressEvent) {
const stats = this.stats;
stats.loaded = event.loaded;
@ -274,6 +305,18 @@ class XhrLoader implements Loader<LoaderContext> {
}
return result;
}
getResponseHeader(name: string): string | null {
if (
this.loader &&
new RegExp(`^${name}:\\s*[\\d.]+\\s*$`, 'im').test(
this.loader.getAllResponseHeaders()
)
) {
return this.loader.getResponseHeader(name);
}
return null;
}
}
export default XhrLoader;
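Two behavioural notes on the loader rework above: `xhrSetup` may now return a promise, so request headers can be prepared asynchronously before the XHR is sent, and the timeout is staged as first-byte (`maxTimeToFirstByteMs`) then overall (`maxLoadTimeMs`). A hedged sketch of an async `xhrSetup`; `fetchAuthToken` is a hypothetical helper, not part of hls.js:

```ts
import Hls from 'hls.js';

declare function fetchAuthToken(): Promise<string>; // hypothetical helper

const hls = new Hls({
  xhrSetup: async (xhr, url) => {
    const token = await fetchAuthToken();
    // If setup opens the request itself, openAndSendXhr skips its own xhr.open call
    xhr.open('GET', url, true);
    xhr.setRequestHeader('Authorization', `Bearer ${token}`);
  },
});
```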