mirror of https://github.com/DanielnetoDotCom/YouPHPTube synced 2025-10-03 17:59:55 +02:00
Daniel Neto 2023-10-25 10:14:46 -03:00
parent b6d47e94c8
commit 65f15c7e46
2882 changed files with 382239 additions and 10785 deletions


@ -1,3 +1,132 @@
<a name="3.6.0"></a>
# [3.6.0](https://github.com/videojs/http-streaming/compare/v3.5.3...v3.6.0) (2023-09-25)
### Features
* Add feature flag to calculate timestampOffset for each segment to handle streams with corrupted pts or dts timestamps ([#1426](https://github.com/videojs/http-streaming/issues/1426)) ([2355ddc](https://github.com/videojs/http-streaming/commit/2355ddc))
* content steering demo page tab ([#1425](https://github.com/videojs/http-streaming/issues/1425)) ([04451d4](https://github.com/videojs/http-streaming/commit/04451d4))
* request Content Steering manifest ([#1419](https://github.com/videojs/http-streaming/issues/1419)) ([86d5327](https://github.com/videojs/http-streaming/commit/86d5327))
### Chores
* update mpd-parser to v1.2.1 ([#1420](https://github.com/videojs/http-streaming/issues/1420)) ([c246ca1](https://github.com/videojs/http-streaming/commit/c246ca1))
* update mpd-parser to v1.2.2 ([#1422](https://github.com/videojs/http-streaming/issues/1422)) ([9ab8c88](https://github.com/videojs/http-streaming/commit/9ab8c88))
<a name="3.5.3"></a>
## [3.5.3](https://github.com/videojs/http-streaming/compare/v3.5.2...v3.5.3) (2023-08-14)
### Bug Fixes
* demo page representation selector ([#1416](https://github.com/videojs/http-streaming/issues/1416)) ([4ca3cab](https://github.com/videojs/http-streaming/commit/4ca3cab))
* fastQualityChange refactor ([#1414](https://github.com/videojs/http-streaming/issues/1414)) ([4590bdd](https://github.com/videojs/http-streaming/commit/4590bdd))
* reduce playlist exclusion defaults ([#1413](https://github.com/videojs/http-streaming/issues/1413)) ([bf0a300](https://github.com/videojs/http-streaming/commit/bf0a300))
* remove segment loader abort in setCurrentTime ([#1415](https://github.com/videojs/http-streaming/issues/1415)) ([323bb32](https://github.com/videojs/http-streaming/commit/323bb32))
<a name="3.5.2"></a>
## [3.5.2](https://github.com/videojs/http-streaming/compare/v3.5.1...v3.5.2) (2023-08-07)
### Bug Fixes
* restore dateTimeObject and dateTimeString usage ([#1412](https://github.com/videojs/http-streaming/issues/1412)) ([5e425c0](https://github.com/videojs/http-streaming/commit/5e425c0))
<a name="3.5.1"></a>
## [3.5.1](https://github.com/videojs/http-streaming/compare/v3.5.0...v3.5.1) (2023-07-26)
### Bug Fixes
* **live:** cue end time not finite ([#1411](https://github.com/videojs/http-streaming/issues/1411)) ([9723e6d](https://github.com/videojs/http-streaming/commit/9723e6d))
<a name="3.5.0"></a>
# [3.5.0](https://github.com/videojs/http-streaming/compare/v3.4.0...v3.5.0) (2023-07-25)
### Features
* add daterange support ([#1402](https://github.com/videojs/http-streaming/issues/1402)) ([7c0e968](https://github.com/videojs/http-streaming/commit/7c0e968))
* enable caption positioning ([#1408](https://github.com/videojs/http-streaming/issues/1408)) ([3c5a5bc](https://github.com/videojs/http-streaming/commit/3c5a5bc))
### Bug Fixes
* **live:** only reset playlist loader for LLHLS ([#1410](https://github.com/videojs/http-streaming/issues/1410)) ([0d8a7a3](https://github.com/videojs/http-streaming/commit/0d8a7a3))
### Chores
* update m3u8-parser version and fix tests ([#1407](https://github.com/videojs/http-streaming/issues/1407)) ([fe25a04](https://github.com/videojs/http-streaming/commit/fe25a04))
### Code Refactoring
* add dateTimeObject and dateTimeString to Cues for backward compatibility ([#1409](https://github.com/videojs/http-streaming/issues/1409)) ([2079454](https://github.com/videojs/http-streaming/commit/2079454))
<a name="3.4.0"></a>
# [3.4.0](https://github.com/videojs/http-streaming/compare/v3.3.1...v3.4.0) (2023-06-01)
### Features
* add support for independentSegments ([#1399](https://github.com/videojs/http-streaming/issues/1399)) ([f9a392f](https://github.com/videojs/http-streaming/commit/f9a392f))
### Bug Fixes
* **llhls:** watcher causes playback failure ([#1398](https://github.com/videojs/http-streaming/issues/1398)) ([3580d1e](https://github.com/videojs/http-streaming/commit/3580d1e))
### Chores
* **package:** update m3u8-parser v6.2.0 ([#1403](https://github.com/videojs/http-streaming/issues/1403)) ([0026717](https://github.com/videojs/http-streaming/commit/0026717))
<a name="3.3.1"></a>
## [3.3.1](https://github.com/videojs/http-streaming/compare/v3.3.0...v3.3.1) (2023-05-15)
### Bug Fixes
* onRequest hooks called too late ([#1396](https://github.com/videojs/http-streaming/issues/1396)) ([19539ea](https://github.com/videojs/http-streaming/commit/19539ea))
### Chores
* Update CI and release workflows ([#1397](https://github.com/videojs/http-streaming/issues/1397)) ([12b378a](https://github.com/videojs/http-streaming/commit/12b378a))
<a name="3.3.0"></a>
# [3.3.0](https://github.com/videojs/http-streaming/compare/v3.2.0...v3.3.0) (2023-05-03)
### Features
* Start at offset from EXT-X-START ([#1389](https://github.com/videojs/http-streaming/issues/1389)) ([b3a508d](https://github.com/videojs/http-streaming/commit/b3a508d))
* **xhr:** add request and response hook API ([#1393](https://github.com/videojs/http-streaming/issues/1393)) ([2356c34](https://github.com/videojs/http-streaming/commit/2356c34))
<a name="3.2.0"></a>
# [3.2.0](https://github.com/videojs/http-streaming/compare/v3.1.0...v3.2.0) (2023-04-04)
### Features
* add an option to support forced subtitles ([#1329](https://github.com/videojs/http-streaming/issues/1329)) ([6bd98d0](https://github.com/videojs/http-streaming/commit/6bd98d0))
* add event stream support ([#1382](https://github.com/videojs/http-streaming/issues/1382)) ([f6b9498](https://github.com/videojs/http-streaming/commit/f6b9498))
* Remove remnants of IE and old Edge ([#1343](https://github.com/videojs/http-streaming/issues/1343)) ([93a2bfd](https://github.com/videojs/http-streaming/commit/93a2bfd))
### Bug Fixes
* allow audio fmp4 emsg probe ([#1385](https://github.com/videojs/http-streaming/issues/1385)) ([c90863c](https://github.com/videojs/http-streaming/commit/c90863c))
* **docs:** Remove confusion around including VHS separately ([#1367](https://github.com/videojs/http-streaming/issues/1367)) ([b4f44e4](https://github.com/videojs/http-streaming/commit/b4f44e4))
* error on undefined metadata frames ([#1383](https://github.com/videojs/http-streaming/issues/1383)) ([d258fae](https://github.com/videojs/http-streaming/commit/d258fae))
* use audio offset for id3 with audio-only ([#1386](https://github.com/videojs/http-streaming/issues/1386)) ([e6d8b08](https://github.com/videojs/http-streaming/commit/e6d8b08))
### Chores
* **package:** update dependencies to de-dupe m3u8-parser in the tree ([#1388](https://github.com/videojs/http-streaming/issues/1388)) ([369ee66](https://github.com/videojs/http-streaming/commit/369ee66))
* update mpd-parser to 1.1.0 ([#1384](https://github.com/videojs/http-streaming/issues/1384)) ([915bdee](https://github.com/videojs/http-streaming/commit/915bdee))
* update mpd-parser to 1.1.1 ([#1387](https://github.com/videojs/http-streaming/issues/1387)) ([9520070](https://github.com/videojs/http-streaming/commit/9520070))
### Code Refactoring
* remove nested loop from removeDuplicateCuesFromTrack function ([#1381](https://github.com/videojs/http-streaming/issues/1381)) ([12acbdd](https://github.com/videojs/http-streaming/commit/12acbdd))
<a name="3.1.0"></a>
# [3.1.0](https://github.com/videojs/http-streaming/compare/v3.0.2...v3.1.0) (2023-03-07)
### Features
* add fmp4 emsg ID3 support ([#1370](https://github.com/videojs/http-streaming/issues/1370)) ([906f29e](https://github.com/videojs/http-streaming/commit/906f29e))
### Chores
* npm publish for release workflow ([#1376](https://github.com/videojs/http-streaming/issues/1376)) ([e5b4bf6](https://github.com/videojs/http-streaming/commit/e5b4bf6))
<a name="3.0.2"></a>
## [3.0.2](https://github.com/videojs/http-streaming/compare/v3.0.1...v3.0.2) (2023-02-27)


@ -9,7 +9,7 @@
Play HLS, DASH, and future HTTP streaming protocols with video.js, even where they're not
natively supported.
Included in video.js 7 by default! See the [video.js 7 blog post](https://blog.videojs.com/video-js-7-is-here/)
**Included in video.js 7 by default!** See the [video.js 7 blog post](https://blog.videojs.com/video-js-7-is-here/)
Maintenance Status: Stable
@ -30,7 +30,6 @@ Video.js Compatibility: 7.x, 8.x
- [Compatibility](#compatibility)
- [Browsers which support MSE](#browsers-which-support-mse)
- [Native only](#native-only)
- [Flash Support](#flash-support)
- [DRM](#drm)
- [Documentation](#documentation)
- [Options](#options)
@ -57,6 +56,7 @@ Video.js Compatibility: 7.x, 8.x
- [liveRangeSafeTimeDelta](#liverangesafetimedelta)
- [useNetworkInformationApi](#usenetworkinformationapi)
- [useDtsForTimestampOffset](#usedtsfortimestampoffset)
- [useForcedSubtitles](#useforcedsubtitles)
- [captionServices](#captionservices)
- [Format](#format)
- [Example](#example)
@ -72,6 +72,7 @@ Video.js Compatibility: 7.x, 8.x
- [vhs.stats](#vhsstats)
- [Events](#events)
- [loadedmetadata](#loadedmetadata)
- [xhr-hooks-ready](#xhr-hooks-ready)
- [VHS Usage Events](#vhs-usage-events)
- [Presence Stats](#presence-stats)
- [Use Stats](#use-stats)
@ -94,8 +95,13 @@ Video.js Compatibility: 7.x, 8.x
<!-- END doctoc generated TOC please keep comment here to allow auto update -->
## Installation
In most cases **it is not necessary to separately install http-streaming**, as it has been included in the default build of Video.js since version 7.
Only install http-streaming separately if you need a specific combination of video.js and http-streaming versions. If installing separately, use the "core" build of Video.js, which does not include a bundled http-streaming.
### NPM
To install `videojs-http-streaming` with npm run
To install `videojs-http-streaming` with npm, run
```bash
npm install --save @videojs/http-streaming
@ -117,11 +123,12 @@ See [CONTRIBUTING.md](/CONTRIBUTING.md)
See [our troubleshooting guide](/docs/troubleshooting.md)
## Talk to us
Drop by our slack channel (#playback) on the [Video.js slack][slack-link].
Drop by the [Video.js slack][slack-link].
## Getting Started
This library is included in video.js 7 by default, if you are using an older version of video.js then
get a copy of [videojs-http-streaming](#installation) and include it in your page along with video.js:
This library is included in Video.js 7 by default.
**Only if you need a specific combination of Video.js and VHS versions** should you get a copy of [videojs-http-streaming](#installation) and include it in your page along with video.js. In that case, use the "core" build of Video.js, which does not bundle VHS:
```html
<video-js id=vid1 width=600 height=300 class="vjs-default-skin" controls>
@ -129,7 +136,8 @@ get a copy of [videojs-http-streaming](#installation) and include it in your pag
src="https://example.com/index.m3u8"
type="application/x-mpegURL">
</video-js>
<script src="video.js"></script>
<!-- "core" version of Video.js -->
<script src="video.core.min.js"></script>
<script src="videojs-http-streaming.min.js"></script>
<script>
var player = videojs('vid1');
@ -137,8 +145,6 @@ player.play();
</script>
```
Check out our [live example](https://jsbin.com/gejugat/edit?html,output) if you're having trouble.
It is recommended to use the `<video-js>` element or to load a source with `player.src(sourceObject)` in order to prevent the video element from playing the source natively where HLS is supported.
## Compatibility
@ -151,7 +157,7 @@ The [Media Source Extensions](http://caniuse.com/#feat=mediasource) API is requi
- Firefox
- Internet Explorer 11 Windows 10 or 8.1
These browsers have some level of native HLS support, which will be used unless the [overrideNative](#overridenative) option is used:
These browsers have some level of native HLS support; however, by default the [overrideNative](#overridenative) option is set to `true` everywhere except Safari, so MSE playback is used:
- Chrome Android
- Firefox Android
@ -162,11 +168,7 @@ These browsers have some level of native HLS support, which will be used unless
- Mac Safari
- iOS Safari
Mac Safari does have MSE support, but native HLS is recommended
### Flash Support
This plugin does not support Flash playback. Instead, it is recommended that users use the [videojs-flashls-source-handler](https://github.com/brightcove/videojs-flashls-source-handler) plugin as a fallback option for browsers that don't have a native
[HLS](https://caniuse.com/#feat=http-live-streaming)/[DASH](https://caniuse.com/#feat=mpeg-dash) player or support for [Media Source Extensions](http://caniuse.com/#feat=mediasource).
Mac and iPad Safari do have MSE support, but native HLS is recommended
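As a sketch of forcing MSE playback where native HLS exists (the player id here is hypothetical; per the [overrideNative](#overridenative) docs, the native track options must be disabled alongside it):

```javascript
// Hypothetical player id; forces VHS/MSE playback even in browsers
// with native HLS support. Native audio/video tracks must also be
// disabled for overrideNative to take effect.
var player = videojs('my-player', {
  html5: {
    vhs: {
      overrideNative: true
    },
    nativeAudioTracks: false,
    nativeVideoTracks: false
  }
});
```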
### DRM
@ -461,6 +463,20 @@ This option defaults to `false`.
* Default: `false`
* Use [Decode Timestamp](https://www.w3.org/TR/media-source/#decode-timestamp) instead of [Presentation Timestamp](https://www.w3.org/TR/media-source/#presentation-timestamp) for [timestampOffset](https://www.w3.org/TR/media-source/#dom-sourcebuffer-timestampoffset) calculation. This option was introduced to align with DTS-based browsers. This option affects only transmuxed data (eg: transport stream). For more info please check the following [issue](https://github.com/videojs/http-streaming/issues/1247).
##### calculateTimestampOffsetForEachSegment
* Type: `boolean`
* Default: `false`
* Calculate timestampOffset for each segment, regardless of its timeline. This can help when a stream has corrupted DTS/PTS timestamps across discontinuities.
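A minimal sketch of enabling this flag (the player id is hypothetical; the option is passed through `html5.vhs` like other VHS options):

```javascript
// Hypothetical player id; turns on per-segment timestampOffset
// calculation for streams with corrupted PTS/DTS at discontinuities.
var player = videojs('my-player', {
  html5: {
    vhs: {
      calculateTimestampOffsetForEachSegment: true
    }
  }
});
```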
##### useForcedSubtitles
* Type: `boolean`
* Default: `false`
* can be used as a source option
* can be used as an initialization option
If `true`, this option allows the player to display forced subtitles. When available, forced subtitles provide translations of foreign-language dialogue or of on-screen text in a foreign language.
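A minimal sketch passing the option with a source (the URL is hypothetical; it can equally be set as an initialization option under `html5.vhs`):

```javascript
// Hypothetical source URL; enables display of forced subtitles
// for this source.
player.src({
  src: 'https://example.com/index.m3u8',
  type: 'application/x-mpegURL',
  useForcedSubtitles: true
});
```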
##### captionServices
* Type: `object`
* Default: undefined
@ -627,45 +643,133 @@ player.tech().vhs.representations().forEach(function(rep) {
#### vhs.xhr
Type: `function`
The xhr function that is used by HLS internally is exposed on the per-
The xhr function that is used by VHS internally is exposed on the per-
player `vhs` object. While it is possible, we do not recommend replacing
the function with your own implementation. Instead, the `xhr` provides
the ability to specify a `beforeRequest` function that will be called
with an object containing the options that will be used to create the
xhr request.
the function with your own implementation. Instead, `xhr` provides
the ability to specify `onRequest` and `onResponse` hooks which each take a
callback function as a parameter, as well as `offRequest` and `offResponse`
functions which can remove a callback function from the `onRequest` or
`onResponse` Set. An `xhr-hooks-ready` event is fired from a player when per-player
hooks are ready to be added or removed. This ensures that player-specific hooks are
set prior to any manifest or segment requests.
The `onRequest(callback)` function takes a `callback` function that will pass an xhr `options`
Object to that callback. These callbacks are called synchronously, in the order registered
and act as pre-request hooks for modifying the xhr `options` Object prior to making a request.
Note: each callback *MUST* return an `options` Object, because the `xhr` wrapper and each
subsequent `onRequest` hook receive the returned `options` as a parameter.
Example:
```javascript
player.tech().vhs.xhr.beforeRequest = function(options) {
options.uri = options.uri.replace('example.com', 'foo.com');
return options;
};
player.on('xhr-hooks-ready', () => {
  const playerRequestHook = (options) => {
    return {
      uri: 'https://new.options.uri'
    };
  };
  player.tech().vhs.xhr.onRequest(playerRequestHook);
});
```
The global `videojs.Vhs` also exposes an `xhr` property. Specifying a
`beforeRequest` function on that will allow you to intercept the options
for *all* requests in every player on a page. For consistency across
browsers the video source should be set at runtime once the video player
is ready.
If access to the `xhr` Object is required prior to the `xhr.send` call, an `options.beforeSend`
callback can be set within an `onRequest` callback function that will pass the `xhr` Object
as a parameter and will be called immediately prior to `xhr.send`.
Example
Example:
```javascript
videojs.Vhs.xhr.beforeRequest = function(options) {
/*
* Modifications to requests that will affect every player.
*/
player.on('xhr-hooks-ready', () => {
  const playerXhrRequestHook = (options) => {
    options.beforeSend = (xhr) => {
      xhr.setRequestHeader('foo', 'bar');
    };
    return options;
  };
  player.tech().vhs.xhr.onRequest(playerXhrRequestHook);
});
```
The `onResponse(callback)` function takes a `callback` function that will pass the xhr
`request`, `error`, and `response` Objects to that callback. These callbacks are called
in the order registered and act as post-request hooks for gathering data from the
xhr `request`, `error` and `response` Objects. `onResponse` callbacks do not require a
return value; the parameters are passed to each subsequent callback by reference.
Example:
```javascript
player.on('xhr-hooks-ready', () => {
  const playerResponseHook = (request, error, response) => {
    const bar = response.headers.foo;
  };
  player.tech().vhs.xhr.onResponse(playerResponseHook);
});
```
The `offRequest` function takes a `callback` function, and will remove that function from
the collection of `onRequest` hooks if it exists.
Example:
```javascript
player.on('xhr-hooks-ready', () => {
  player.tech().vhs.xhr.offRequest(playerRequestHook);
});
```
The `offResponse` function takes a `callback` function, and will remove that function from
the collection of `onResponse` hooks if it exists.
Example:
```javascript
player.on('xhr-hooks-ready', () => {
  player.tech().vhs.xhr.offResponse(playerResponseHook);
});
```
The global `videojs.Vhs` also exposes an `xhr` property. Adding `onRequest`
and/or `onResponse` hooks will allow you to intercept the request options and xhr
Object as well as request, error, and response data for *all* requests in *every*
player on a page. For consistency across browsers the video source should be set
at runtime once the video player is ready.
Example:
```javascript
// Global request callback, will affect every player.
const globalRequestHook = (options) => {
  return {
    uri: 'https://new.options.global.uri'
  };
};
videojs.Vhs.xhr.onRequest(globalRequestHook);
```
```javascript
// Global request callback defining beforeSend function, will affect every player.
const globalXhrRequestHook = (options) => {
  options.beforeSend = (xhr) => {
    xhr.setRequestHeader('foo', 'bar');
  };
  return options;
};
videojs.Vhs.xhr.onRequest(globalXhrRequestHook);
```
```javascript
var player = videojs('video-player-id');

player.ready(function() {
  this.src({
    src: 'https://d2zihajmogu5jn.cloudfront.net/bipbop-advanced/bipbop_16x9_variant.m3u8',
    type: 'application/x-mpegURL',
  });
});
```
```javascript
// Global response hook callback, will affect every player.
const globalResponseHook = (request, error, response) => {
  const bar = response.headers.foo;
};
videojs.Vhs.xhr.onResponse(globalResponseHook);
```
```javascript
// Remove a global onRequest callback.
videojs.Vhs.xhr.offRequest(globalRequestHook);
```
```javascript
// Remove a global onResponse callback.
videojs.Vhs.xhr.offResponse(globalResponseHook);
```
For information on the type of options that you can modify see the
@ -707,6 +811,11 @@ are triggered on the player object.
Fired after the first segment is downloaded for a playlist. This will not happen
until playback if video.js's `metadata` setting is `none`
#### xhr-hooks-ready
Fired when the player `xhr` object is ready to set `onRequest` and `onResponse` hooks, as well
as remove hooks with `offRequest` and `offResponse`.
### VHS Usage Events
Usage tracking events are fired when we detect a certain HLS feature, encoding setting,

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long


@ -64,6 +64,7 @@ not meant to serve as an exhaustive list.
* In-manifest [WebVTT] subtitles are automatically translated into standard HTML5 subtitle
tracks
* [AES-128] segment encryption
* DASH In-manifest EventStream and Event tags are automatically translated into HTML5 metadata cues
## Notable Missing Features
@ -114,8 +115,6 @@ not yet been implemented. VHS currently supports everything in the
* [EXT-X-DATERANGE]
* [EXT-X-SESSION-DATA]
* [EXT-X-SESSION-KEY]
* [EXT-X-INDEPENDENT-SEGMENTS]
* Use of [EXT-X-START] (value parsed but not used)
* Alternate video via [EXT-X-MEDIA] of type video
* ASSOC-LANGUAGE in [EXT-X-MEDIA]
* CHANNELS in [EXT-X-MEDIA]
@ -278,7 +277,6 @@ simply take on their default values (in the case where they have valid defaults)
[EXT-X-I-FRAMES-ONLY]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.3.6
[EXT-X-I-FRAME-STREAM-INF]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.4.3
[EXT-X-SESSION-KEY]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.4.5
[EXT-X-INDEPENDENT-SEGMENTS]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.5.1
[EXT-X-START]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.5.2
[EXT-X-MEDIA]: https://tools.ietf.org/html/draft-pantos-http-live-streaming-23#section-4.3.4.1


@ -49,6 +49,9 @@
<li class="nav-item" role="presentation">
<button class="nav-link" id="profile-tab" data-bs-toggle="tab" data-bs-target="#player-stats" type="button" role="tab" aria-selected="false">Player Stats</button>
</li>
<li class="nav-item" role="presentation">
<button class="nav-link" id="profile-tab" data-bs-toggle="tab" data-bs-target="#content-steering" type="button" role="tab" aria-selected="false">Content Steering</button>
</li>
</ul>
<div class="tab-content container-fluid">
@ -141,6 +144,11 @@
<label class="form-check-label" for="dts-offset">Use DTS instead of PTS for Timestamp Offset calculation (reloads player)</label>
</div>
<div class="form-check">
<input id=offset-each-segment type="checkbox" class="form-check-input">
<label class="form-check-label" for="offset-each-segment">Calculate timestampOffset for each segment, regardless of its timeline (reloads player)</label>
</div>
<div class="form-check">
<input id=llhls type="checkbox" class="form-check-input">
<label class="form-check-label" for="llhls">[EXPERIMENTAL] Enables support for ll-hls (reloads player)</label>
@ -171,6 +179,11 @@
<label class="form-check-label" for="mirror-source">Mirror sources from player.src (reloads player, uses EXPERIMENTAL sourceset option)</label>
</div>
<div class="form-check">
<input id="forced-subtitles" type="checkbox" class="form-check-input">
<label class="form-check-label" for="forced-subtitles">Use Forced Subtitles (reloads player)</label>
</div>
<div class="input-group">
<span class="input-group-text"><label for=preload>Preload (reloads player)</label></span>
<select id=preload class="form-select">
@ -216,6 +229,22 @@
<ul id="segment-metadata" class="col-8"></ul>
</div>
</div>
<div class="tab-pane" id="content-steering" role="tabpanel">
<div class="row">
<div class="content-steering col-8">
<dl>
<dt>Current Pathway:</dt>
<dd class="current-pathway"></dd>
<dt>Available Pathways:</dt>
<dd class="available-pathways"></dd>
<dt>Steering Manifest:</dt>
<dd class="steering-manifest"></dd>
</dl>
</div>
</div>
</div>
</div>
</div>
<footer class="text-center p-3" id=unit-test-link>


@ -0,0 +1,12 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
exec "$basedir/node" "$basedir/../mux.js/bin/transmux.js" "$@"
else
exec node "$basedir/../mux.js/bin/transmux.js" "$@"
fi


@ -0,0 +1,17 @@
@ECHO off
GOTO start
:find_dp0
SET dp0=%~dp0
EXIT /b
:start
SETLOCAL
CALL :find_dp0
IF EXIST "%dp0%\node.exe" (
SET "_prog=%dp0%\node.exe"
) ELSE (
SET "_prog=node"
SET PATHEXT=%PATHEXT:;.JS;=;%
)
endLocal & goto #_undefined_# 2>NUL || title %COMSPEC% & "%_prog%" "%dp0%\..\mux.js\bin\transmux.js" %*


@ -0,0 +1,28 @@
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent
$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
# Fix case when both the Windows and Linux builds of Node
# are installed in the same directory
$exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
# Support pipeline input
if ($MyInvocation.ExpectingInput) {
$input | & "$basedir/node$exe" "$basedir/../mux.js/bin/transmux.js" $args
} else {
& "$basedir/node$exe" "$basedir/../mux.js/bin/transmux.js" $args
}
$ret=$LASTEXITCODE
} else {
# Support pipeline input
if ($MyInvocation.ExpectingInput) {
$input | & "node$exe" "$basedir/../mux.js/bin/transmux.js" $args
} else {
& "node$exe" "$basedir/../mux.js/bin/transmux.js" $args
}
$ret=$LASTEXITCODE
}
exit $ret


@ -1,3 +1,36 @@
<a name="7.1.0"></a>
# [7.1.0](https://github.com/videojs/m3u8-parser/compare/v7.0.0...v7.1.0) (2023-08-07)
### Features
* parse content steering tags and attributes ([#176](https://github.com/videojs/m3u8-parser/issues/176)) ([42472c5](https://github.com/videojs/m3u8-parser/commit/42472c5))
### Bug Fixes
* add dateTimeObject and dateTimeString for backward compatibility ([#174](https://github.com/videojs/m3u8-parser/issues/174)) ([6944bb1](https://github.com/videojs/m3u8-parser/commit/6944bb1))
* merge dateRange tags with same IDs and no conflicting attributes ([#175](https://github.com/videojs/m3u8-parser/issues/175)) ([73d934c](https://github.com/videojs/m3u8-parser/commit/73d934c))
### Chores
* update v7.0.0 documentation ([#172](https://github.com/videojs/m3u8-parser/issues/172)) ([72da994](https://github.com/videojs/m3u8-parser/commit/72da994))
<a name="7.0.0"></a>
# [7.0.0](https://github.com/videojs/m3u8-parser/compare/v6.2.0...v7.0.0) (2023-07-10)
### Features
* Add PDT to each segment ([#168](https://github.com/videojs/m3u8-parser/issues/168)) ([e7c683f](https://github.com/videojs/m3u8-parser/commit/e7c683f))
* output segment title from EXTINF ([#158](https://github.com/videojs/m3u8-parser/issues/158)) ([4adaa2c](https://github.com/videojs/m3u8-parser/commit/4adaa2c))
### Documentation
* correct `customType` option name ([#147](https://github.com/videojs/m3u8-parser/issues/147)) ([4d3e6ce](https://github.com/videojs/m3u8-parser/commit/4d3e6ce))
### BREAKING CHANGES
* rename `daterange` to `dateRanges`
* remove `dateTimeObject` and `dateTimeString` from parsed segment and replaces it with `programDateTime` which represents the timestamp in milliseconds
<a name="6.2.0"></a>
# [6.2.0](https://github.com/videojs/m3u8-parser/compare/v6.1.0...v6.2.0) (2023-05-25)


@ -57,6 +57,7 @@ var manifest = [
'0.ts',
'#EXTINF:6,',
'1.ts',
'#EXT-X-PROGRAM-DATE-TIME:2019-02-14T02:14:00.106Z',
'#EXTINF:6,',
'2.ts',
'#EXT-X-ENDLIST'
@ -79,6 +80,7 @@ Manifest {
allowCache: boolean,
endList: boolean,
mediaSequence: number,
dateRanges: [],
discontinuitySequence: number,
playlistType: string,
custom: {},
@ -113,11 +115,13 @@ Manifest {
discontinuityStarts: [number],
segments: [
{
title: string,
byterange: {
length: number,
offset: number
},
duration: number,
programDateTime: number,
attributes: {},
discontinuity: number,
uri: string,


@ -1,4 +1,4 @@
/*! @name m3u8-parser @version 6.2.0 @license Apache-2.0 */
/*! @name m3u8-parser @version 7.1.0 @license Apache-2.0 */
'use strict';
Object.defineProperty(exports, '__esModule', { value: true });
@ -748,6 +748,18 @@ class ParseStream extends Stream__default["default"] {
tagType: 'independent-segments'
});
return;
}
match = /^#EXT-X-CONTENT-STEERING:(.*)$/.exec(newLine);
if (match) {
event = {
type: 'tag',
tagType: 'content-steering'
};
event.attributes = parseAttributes(match[1]);
this.trigger('data', event);
return;
} // unknown tag type
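As a standalone sketch of the match above (the tag value is hypothetical; `parseAttributes` is defined elsewhere in this file, so only the regex is exercised here):

```javascript
// The capture group holds the raw attribute list, which parseAttributes
// then splits into key/value pairs.
const line = '#EXT-X-CONTENT-STEERING:SERVER-URI="https://example.com/steering",PATHWAY-ID="cdn-a"';
const match = /^#EXT-X-CONTENT-STEERING:(.*)$/.exec(line);
// match[1] === 'SERVER-URI="https://example.com/steering",PATHWAY-ID="cdn-a"'
```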
@ -908,6 +920,7 @@ class Parser extends Stream__default["default"] {
this.lineStream = new LineStream();
this.parseStream = new ParseStream();
this.lineStream.pipe(this.parseStream);
this.lastProgramDateTime = null;
/* eslint-disable consistent-this */
const self = this;
@ -938,6 +951,7 @@ class Parser extends Stream__default["default"] {
this.manifest = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
segments: []
}; // keep track of the last seen segment's byte range end, as segments are not required
// to provide the offset, in which case it defaults to the next byte after the
@ -946,7 +960,7 @@ class Parser extends Stream__default["default"] {
let lastByterangeEnd = 0; // keep track of the last seen part's byte range end.
let lastPartByterangeEnd = 0;
const daterangeTags = {};
const dateRangeTags = {};
this.on('end', () => {
// only add preloadSegment if we don't yet have a uri for it.
// and we actually have parts/preloadHints
@ -1042,6 +1056,10 @@ class Parser extends Stream__default["default"] {
});
}
if (entry.title) {
currentUri.title = entry.title;
}
if (entry.duration > 0) {
currentUri.duration = entry.duration;
}
@ -1294,6 +1312,21 @@ class Parser extends Stream__default["default"] {
currentUri.dateTimeString = entry.dateTimeString;
currentUri.dateTimeObject = entry.dateTimeObject;
const {
lastProgramDateTime
} = this;
this.lastProgramDateTime = new Date(entry.dateTimeString).getTime(); // We should extrapolate Program Date Time backward only during first program date time occurrence.
// Once we have at least one program date time point, we can always extrapolate it forward using lastProgramDateTime reference.
if (lastProgramDateTime === null) {
// Extrapolate Program Date Time backward
// Since it is first program date time occurrence we're assuming that
// all this.manifest.segments have no program date time info
this.manifest.segments.reduceRight((programDateTime, segment) => {
segment.programDateTime = programDateTime - segment.duration * 1000;
return segment.programDateTime;
}, this.lastProgramDateTime);
}
},
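The backward extrapolation above can be sketched in isolation (segment durations and the timestamp are made-up values):

```javascript
// Walks already-parsed segments right-to-left from the first observed
// EXT-X-PROGRAM-DATE-TIME (in ms), subtracting each duration (seconds).
function extrapolateBackward(segments, lastProgramDateTime) {
  segments.reduceRight((programDateTime, segment) => {
    segment.programDateTime = programDateTime - segment.duration * 1000;
    return segment.programDateTime;
  }, lastProgramDateTime);
}

const segments = [{ duration: 6 }, { duration: 6 }];
extrapolateBackward(segments, 1550000000000);
// segments[1].programDateTime → 1549999994000 (6s earlier)
// segments[0].programDateTime → 1549999988000 (12s earlier)
```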
targetduration() {
@ -1457,72 +1490,77 @@ class Parser extends Stream__default["default"] {
},
'daterange'() {
this.manifest.daterange = this.manifest.daterange || [];
this.manifest.daterange.push(camelCaseKeys(entry.attributes));
const index = this.manifest.daterange.length - 1;
this.manifest.dateRanges.push(camelCaseKeys(entry.attributes));
const index = this.manifest.dateRanges.length - 1;
this.warnOnMissingAttributes_(`#EXT-X-DATERANGE #${index}`, entry.attributes, ['ID', 'START-DATE']);
const daterange = this.manifest.daterange[index];
const dateRange = this.manifest.dateRanges[index];
if (daterange.endDate && daterange.startDate && new Date(daterange.endDate) < new Date(daterange.startDate)) {
if (dateRange.endDate && dateRange.startDate && new Date(dateRange.endDate) < new Date(dateRange.startDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE END-DATE must be equal to or later than the value of the START-DATE'
});
}
if (daterange.duration && daterange.duration < 0) {
if (dateRange.duration && dateRange.duration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE DURATION must not be negative'
});
}
if (daterange.plannedDuration && daterange.plannedDuration < 0) {
if (dateRange.plannedDuration && dateRange.plannedDuration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE PLANNED-DURATION must not be negative'
});
}
const endOnNextYes = !!daterange.endOnNext;
const endOnNextYes = !!dateRange.endOnNext;
if (endOnNextYes && !daterange.class) {
if (endOnNextYes && !dateRange.class) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must have a CLASS attribute'
});
}
if (endOnNextYes && (daterange.duration || daterange.endDate)) {
if (endOnNextYes && (dateRange.duration || dateRange.endDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must not contain DURATION or END-DATE attributes'
});
}
if (daterange.duration && daterange.endDate) {
const startDate = daterange.startDate;
const newDateInSeconds = startDate.setSeconds(startDate.getSeconds() + daterange.duration);
this.manifest.daterange[index].endDate = new Date(newDateInSeconds);
if (dateRange.duration && dateRange.endDate) {
const startDate = dateRange.startDate;
const newDateInSeconds = startDate.getTime() + dateRange.duration * 1000;
this.manifest.dateRanges[index].endDate = new Date(newDateInSeconds);
}
if (daterange && !this.manifest.dateTimeString) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
if (!daterangeTags[daterange.id]) {
daterangeTags[daterange.id] = daterange;
if (!dateRangeTags[dateRange.id]) {
dateRangeTags[dateRange.id] = dateRange;
} else {
for (const attribute in daterangeTags[daterange.id]) {
if (daterangeTags[daterange.id][attribute] !== daterange[attribute]) {
for (const attribute in dateRangeTags[dateRange.id]) {
if (!!dateRange[attribute] && JSON.stringify(dateRangeTags[dateRange.id][attribute]) !== JSON.stringify(dateRange[attribute])) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes and same attribute values'
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes values'
});
break;
}
}
} // if tags with the same ID do not have conflicting attributes, merge them
const dateRangeWithSameId = this.manifest.dateRanges.findIndex(dateRangeToFind => dateRangeToFind.id === dateRange.id);
this.manifest.dateRanges[dateRangeWithSameId] = _extends__default["default"](this.manifest.dateRanges[dateRangeWithSameId], dateRange);
dateRangeTags[dateRange.id] = _extends__default["default"](dateRangeTags[dateRange.id], dateRange); // after merging, delete the duplicate dateRange that was added last
this.manifest.dateRanges.pop();
}
},
'independent-segments'() {
this.manifest.independentSegments = true;
},
'content-steering'() {
this.manifest.contentSteering = camelCaseKeys(entry.attributes);
this.warnOnMissingAttributes_('#EXT-X-CONTENT-STEERING', entry.attributes, ['SERVER-URI']);
}
})[entry.tagType] || noop).call(self);
@ -1551,7 +1589,13 @@ class Parser extends Stream__default["default"] {
} // reset the last byterange end as it needs to be 0 between parts
lastPartByterangeEnd = 0; // prepare for the next URI
lastPartByterangeEnd = 0; // Once we have at least one program date time we can always extrapolate it forward
if (this.lastProgramDateTime !== null) {
currentUri.programDateTime = this.lastProgramDateTime;
this.lastProgramDateTime += currentUri.duration * 1000;
} // prepare for the next URI
currentUri = {};
},
@ -1608,6 +1652,14 @@ class Parser extends Stream__default["default"] {
end() {
// flush any buffered input
this.lineStream.push('\n');
if (this.manifest.dateRanges.length && this.lastProgramDateTime === null) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
this.lastProgramDateTime = null;
this.trigger('end');
}
/**
@ -1615,7 +1667,7 @@ class Parser extends Stream__default["default"] {
*
* @param {Object} options a map of options for the added parser
* @param {RegExp} options.expression a regular expression to match the custom header
* @param {string} options.type the type to register to the output
* @param {string} options.customType the custom type to register to the output
* @param {Function} [options.dataParser] function to parse the line into an object
* @param {boolean} [options.segment] should tag data be attached to the segment object
*/
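The hunks above replace the `setSeconds`-based END-DATE computation with plain millisecond arithmetic. `Date.prototype.setSeconds` mutates the date in place and truncates its argument to an integer, so a fractional DURATION (in seconds) both corrupts `startDate` and loses sub-second precision, whereas `getTime() + duration * 1000` is non-mutating and exact. A minimal sketch of the difference (the sample date and duration are illustrative, not taken from the diff):

```javascript
// Old approach: setSeconds() mutates the date and truncates fractional seconds.
const mutated = new Date('2023-01-01T00:00:00.000Z');
mutated.setSeconds(mutated.getSeconds() + 5.5); // the .5 is silently dropped

// New approach from the diff: non-mutating millisecond arithmetic.
const startDate = new Date('2023-01-01T00:00:00.000Z');
const newDateInSeconds = startDate.getTime() + 5.5 * 1000;
const endDate = new Date(newDateInSeconds);

console.log(mutated.toISOString());   // fractional seconds lost
console.log(endDate.toISOString());   // 2023-01-01T00:00:05.500Z
console.log(startDate.toISOString()); // startDate itself is unchanged
```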


@ -1,4 +1,4 @@
/*! @name m3u8-parser @version 6.2.0 @license Apache-2.0 */
/*! @name m3u8-parser @version 7.1.0 @license Apache-2.0 */
import Stream from '@videojs/vhs-utils/es/stream.js';
import _extends from '@babel/runtime/helpers/extends';
import decodeB64ToUint8Array from '@videojs/vhs-utils/es/decode-b64-to-uint8-array.js';
@ -738,6 +738,18 @@ class ParseStream extends Stream {
tagType: 'independent-segments'
});
return;
}
match = /^#EXT-X-CONTENT-STEERING:(.*)$/.exec(newLine);
if (match) {
event = {
type: 'tag',
tagType: 'content-steering'
};
event.attributes = parseAttributes(match[1]);
this.trigger('data', event);
return;
} // unknown tag type
@ -898,6 +910,7 @@ class Parser extends Stream {
this.lineStream = new LineStream();
this.parseStream = new ParseStream();
this.lineStream.pipe(this.parseStream);
this.lastProgramDateTime = null;
/* eslint-disable consistent-this */
const self = this;
@ -928,6 +941,7 @@ class Parser extends Stream {
this.manifest = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
segments: []
}; // keep track of the last seen segment's byte range end, as segments are not required
// to provide the offset, in which case it defaults to the next byte after the
@ -936,7 +950,7 @@ class Parser extends Stream {
let lastByterangeEnd = 0; // keep track of the last seen part's byte range end.
let lastPartByterangeEnd = 0;
const daterangeTags = {};
const dateRangeTags = {};
this.on('end', () => {
// only add preloadSegment if we don't yet have a uri for it.
// and we actually have parts/preloadHints
@ -1032,6 +1046,10 @@ class Parser extends Stream {
});
}
if (entry.title) {
currentUri.title = entry.title;
}
if (entry.duration > 0) {
currentUri.duration = entry.duration;
}
@ -1284,6 +1302,21 @@ class Parser extends Stream {
currentUri.dateTimeString = entry.dateTimeString;
currentUri.dateTimeObject = entry.dateTimeObject;
const {
lastProgramDateTime
} = this;
this.lastProgramDateTime = new Date(entry.dateTimeString).getTime(); // We should extrapolate Program Date Time backward only during first program date time occurrence.
// Once we have at least one program date time point, we can always extrapolate it forward using lastProgramDateTime reference.
if (lastProgramDateTime === null) {
// Extrapolate Program Date Time backward
// Since it is first program date time occurrence we're assuming that
// all this.manifest.segments have no program date time info
this.manifest.segments.reduceRight((programDateTime, segment) => {
segment.programDateTime = programDateTime - segment.duration * 1000;
return segment.programDateTime;
}, this.lastProgramDateTime);
}
},
targetduration() {
@ -1447,72 +1480,77 @@ class Parser extends Stream {
},
'daterange'() {
this.manifest.daterange = this.manifest.daterange || [];
this.manifest.daterange.push(camelCaseKeys(entry.attributes));
const index = this.manifest.daterange.length - 1;
this.manifest.dateRanges.push(camelCaseKeys(entry.attributes));
const index = this.manifest.dateRanges.length - 1;
this.warnOnMissingAttributes_(`#EXT-X-DATERANGE #${index}`, entry.attributes, ['ID', 'START-DATE']);
const daterange = this.manifest.daterange[index];
const dateRange = this.manifest.dateRanges[index];
if (daterange.endDate && daterange.startDate && new Date(daterange.endDate) < new Date(daterange.startDate)) {
if (dateRange.endDate && dateRange.startDate && new Date(dateRange.endDate) < new Date(dateRange.startDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE END-DATE must be equal to or later than the value of the START-DATE'
});
}
if (daterange.duration && daterange.duration < 0) {
if (dateRange.duration && dateRange.duration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE DURATION must not be negative'
});
}
if (daterange.plannedDuration && daterange.plannedDuration < 0) {
if (dateRange.plannedDuration && dateRange.plannedDuration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE PLANNED-DURATION must not be negative'
});
}
const endOnNextYes = !!daterange.endOnNext;
const endOnNextYes = !!dateRange.endOnNext;
if (endOnNextYes && !daterange.class) {
if (endOnNextYes && !dateRange.class) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must have a CLASS attribute'
});
}
if (endOnNextYes && (daterange.duration || daterange.endDate)) {
if (endOnNextYes && (dateRange.duration || dateRange.endDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must not contain DURATION or END-DATE attributes'
});
}
if (daterange.duration && daterange.endDate) {
const startDate = daterange.startDate;
const newDateInSeconds = startDate.setSeconds(startDate.getSeconds() + daterange.duration);
this.manifest.daterange[index].endDate = new Date(newDateInSeconds);
if (dateRange.duration && dateRange.endDate) {
const startDate = dateRange.startDate;
const newDateInSeconds = startDate.getTime() + dateRange.duration * 1000;
this.manifest.dateRanges[index].endDate = new Date(newDateInSeconds);
}
if (daterange && !this.manifest.dateTimeString) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
if (!daterangeTags[daterange.id]) {
daterangeTags[daterange.id] = daterange;
if (!dateRangeTags[dateRange.id]) {
dateRangeTags[dateRange.id] = dateRange;
} else {
for (const attribute in daterangeTags[daterange.id]) {
if (daterangeTags[daterange.id][attribute] !== daterange[attribute]) {
for (const attribute in dateRangeTags[dateRange.id]) {
if (!!dateRange[attribute] && JSON.stringify(dateRangeTags[dateRange.id][attribute]) !== JSON.stringify(dateRange[attribute])) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes and same attribute values'
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes values'
});
break;
}
}
} // if tags with the same ID do not have conflicting attributes, merge them
const dateRangeWithSameId = this.manifest.dateRanges.findIndex(dateRangeToFind => dateRangeToFind.id === dateRange.id);
this.manifest.dateRanges[dateRangeWithSameId] = _extends(this.manifest.dateRanges[dateRangeWithSameId], dateRange);
dateRangeTags[dateRange.id] = _extends(dateRangeTags[dateRange.id], dateRange); // after merging, delete the duplicate dateRange that was added last
this.manifest.dateRanges.pop();
}
},
'independent-segments'() {
this.manifest.independentSegments = true;
},
'content-steering'() {
this.manifest.contentSteering = camelCaseKeys(entry.attributes);
this.warnOnMissingAttributes_('#EXT-X-CONTENT-STEERING', entry.attributes, ['SERVER-URI']);
}
})[entry.tagType] || noop).call(self);
@ -1541,7 +1579,13 @@ class Parser extends Stream {
} // reset the last byterange end as it needs to be 0 between parts
lastPartByterangeEnd = 0; // prepare for the next URI
lastPartByterangeEnd = 0; // Once we have at least one program date time we can always extrapolate it forward
if (this.lastProgramDateTime !== null) {
currentUri.programDateTime = this.lastProgramDateTime;
this.lastProgramDateTime += currentUri.duration * 1000;
} // prepare for the next URI
currentUri = {};
},
@ -1598,6 +1642,14 @@ class Parser extends Stream {
end() {
// flush any buffered input
this.lineStream.push('\n');
if (this.manifest.dateRanges.length && this.lastProgramDateTime === null) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
this.lastProgramDateTime = null;
this.trigger('end');
}
/**
@ -1605,7 +1657,7 @@ class Parser extends Stream {
*
* @param {Object} options a map of options for the added parser
* @param {RegExp} options.expression a regular expression to match the custom header
* @param {string} options.type the type to register to the output
* @param {string} options.customType the custom type to register to the output
* @param {Function} [options.dataParser] function to parse the line into an object
* @param {boolean} [options.segment] should tag data be attached to the segment object
*/
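The new `lastProgramDateTime` bookkeeping in the hunks above back-fills `programDateTime` for segments parsed before the first `#EXT-X-PROGRAM-DATE-TIME` tag: walking the segment list right to left, each segment's timestamp is the following segment's timestamp minus its own duration in milliseconds. A standalone sketch of that `reduceRight` pass (segment durations and the anchor date are made up for illustration):

```javascript
// Segments seen before the first EXT-X-PROGRAM-DATE-TIME tag: no timestamps yet.
const segments = [
  { duration: 10 },
  { duration: 6 },
  { duration: 4 }
];

// The tag's timestamp anchors the segment that follows it, so earlier segments
// are extrapolated backward from it, mirroring the diff's reduceRight pass.
const lastProgramDateTime = new Date('2023-01-01T00:00:20.000Z').getTime();

segments.reduceRight((programDateTime, segment) => {
  segment.programDateTime = programDateTime - segment.duration * 1000;
  return segment.programDateTime;
}, lastProgramDateTime);

// segments[2] starts 4s before the anchor, segments[1] 10s, segments[0] 20s.
console.log(segments.map(s => new Date(s.programDateTime).toISOString()));
```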


@ -1,4 +1,4 @@
/*! @name m3u8-parser @version 6.2.0 @license Apache-2.0 */
/*! @name m3u8-parser @version 7.1.0 @license Apache-2.0 */
(function (global, factory) {
typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports) :
typeof define === 'function' && define.amd ? define(['exports'], factory) :
@ -881,6 +881,18 @@
tagType: 'independent-segments'
});
return;
}
match = /^#EXT-X-CONTENT-STEERING:(.*)$/.exec(newLine);
if (match) {
event = {
type: 'tag',
tagType: 'content-steering'
};
event.attributes = parseAttributes(match[1]);
this.trigger('data', event);
return;
} // unknown tag type
@ -1056,6 +1068,7 @@
this.lineStream = new LineStream();
this.parseStream = new ParseStream();
this.lineStream.pipe(this.parseStream);
this.lastProgramDateTime = null;
/* eslint-disable consistent-this */
const self = this;
@ -1086,6 +1099,7 @@
this.manifest = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
segments: []
}; // keep track of the last seen segment's byte range end, as segments are not required
// to provide the offset, in which case it defaults to the next byte after the
@ -1094,7 +1108,7 @@
let lastByterangeEnd = 0; // keep track of the last seen part's byte range end.
let lastPartByterangeEnd = 0;
const daterangeTags = {};
const dateRangeTags = {};
this.on('end', () => {
// only add preloadSegment if we don't yet have a uri for it.
// and we actually have parts/preloadHints
@ -1190,6 +1204,10 @@
});
}
if (entry.title) {
currentUri.title = entry.title;
}
if (entry.duration > 0) {
currentUri.duration = entry.duration;
}
@ -1442,6 +1460,21 @@
currentUri.dateTimeString = entry.dateTimeString;
currentUri.dateTimeObject = entry.dateTimeObject;
const {
lastProgramDateTime
} = this;
this.lastProgramDateTime = new Date(entry.dateTimeString).getTime(); // We should extrapolate Program Date Time backward only during first program date time occurrence.
// Once we have at least one program date time point, we can always extrapolate it forward using lastProgramDateTime reference.
if (lastProgramDateTime === null) {
// Extrapolate Program Date Time backward
// Since it is first program date time occurrence we're assuming that
// all this.manifest.segments have no program date time info
this.manifest.segments.reduceRight((programDateTime, segment) => {
segment.programDateTime = programDateTime - segment.duration * 1000;
return segment.programDateTime;
}, this.lastProgramDateTime);
}
},
targetduration() {
@ -1605,72 +1638,77 @@
},
'daterange'() {
this.manifest.daterange = this.manifest.daterange || [];
this.manifest.daterange.push(camelCaseKeys(entry.attributes));
const index = this.manifest.daterange.length - 1;
this.manifest.dateRanges.push(camelCaseKeys(entry.attributes));
const index = this.manifest.dateRanges.length - 1;
this.warnOnMissingAttributes_(`#EXT-X-DATERANGE #${index}`, entry.attributes, ['ID', 'START-DATE']);
const daterange = this.manifest.daterange[index];
const dateRange = this.manifest.dateRanges[index];
if (daterange.endDate && daterange.startDate && new Date(daterange.endDate) < new Date(daterange.startDate)) {
if (dateRange.endDate && dateRange.startDate && new Date(dateRange.endDate) < new Date(dateRange.startDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE END-DATE must be equal to or later than the value of the START-DATE'
});
}
if (daterange.duration && daterange.duration < 0) {
if (dateRange.duration && dateRange.duration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE DURATION must not be negative'
});
}
if (daterange.plannedDuration && daterange.plannedDuration < 0) {
if (dateRange.plannedDuration && dateRange.plannedDuration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE PLANNED-DURATION must not be negative'
});
}
const endOnNextYes = !!daterange.endOnNext;
const endOnNextYes = !!dateRange.endOnNext;
if (endOnNextYes && !daterange.class) {
if (endOnNextYes && !dateRange.class) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must have a CLASS attribute'
});
}
if (endOnNextYes && (daterange.duration || daterange.endDate)) {
if (endOnNextYes && (dateRange.duration || dateRange.endDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must not contain DURATION or END-DATE attributes'
});
}
if (daterange.duration && daterange.endDate) {
const startDate = daterange.startDate;
const newDateInSeconds = startDate.setSeconds(startDate.getSeconds() + daterange.duration);
this.manifest.daterange[index].endDate = new Date(newDateInSeconds);
if (dateRange.duration && dateRange.endDate) {
const startDate = dateRange.startDate;
const newDateInSeconds = startDate.getTime() + dateRange.duration * 1000;
this.manifest.dateRanges[index].endDate = new Date(newDateInSeconds);
}
if (daterange && !this.manifest.dateTimeString) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
if (!daterangeTags[daterange.id]) {
daterangeTags[daterange.id] = daterange;
if (!dateRangeTags[dateRange.id]) {
dateRangeTags[dateRange.id] = dateRange;
} else {
for (const attribute in daterangeTags[daterange.id]) {
if (daterangeTags[daterange.id][attribute] !== daterange[attribute]) {
for (const attribute in dateRangeTags[dateRange.id]) {
if (!!dateRange[attribute] && JSON.stringify(dateRangeTags[dateRange.id][attribute]) !== JSON.stringify(dateRange[attribute])) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes and same attribute values'
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes values'
});
break;
}
}
} // if tags with the same ID do not have conflicting attributes, merge them
const dateRangeWithSameId = this.manifest.dateRanges.findIndex(dateRangeToFind => dateRangeToFind.id === dateRange.id);
this.manifest.dateRanges[dateRangeWithSameId] = _extends$1(this.manifest.dateRanges[dateRangeWithSameId], dateRange);
dateRangeTags[dateRange.id] = _extends$1(dateRangeTags[dateRange.id], dateRange); // after merging, delete the duplicate dateRange that was added last
this.manifest.dateRanges.pop();
}
},
'independent-segments'() {
this.manifest.independentSegments = true;
},
'content-steering'() {
this.manifest.contentSteering = camelCaseKeys(entry.attributes);
this.warnOnMissingAttributes_('#EXT-X-CONTENT-STEERING', entry.attributes, ['SERVER-URI']);
}
})[entry.tagType] || noop).call(self);
@ -1699,7 +1737,13 @@
} // reset the last byterange end as it needs to be 0 between parts
lastPartByterangeEnd = 0; // prepare for the next URI
lastPartByterangeEnd = 0; // Once we have at least one program date time we can always extrapolate it forward
if (this.lastProgramDateTime !== null) {
currentUri.programDateTime = this.lastProgramDateTime;
this.lastProgramDateTime += currentUri.duration * 1000;
} // prepare for the next URI
currentUri = {};
},
@ -1756,6 +1800,14 @@
end() {
// flush any buffered input
this.lineStream.push('\n');
if (this.manifest.dateRanges.length && this.lastProgramDateTime === null) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
this.lastProgramDateTime = null;
this.trigger('end');
}
/**
@ -1763,7 +1815,7 @@
*
* @param {Object} options a map of options for the added parser
* @param {RegExp} options.expression a regular expression to match the custom header
* @param {string} options.type the type to register to the output
* @param {string} options.customType the custom type to register to the output
* @param {Function} [options.dataParser] function to parse the line into an object
* @param {boolean} [options.segment] should tag data be attached to the segment object
*/
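Complementing the backward fill, the `uri()` hunk above extrapolates forward: once any program date time is known, each subsequent segment without an explicit tag receives the running `lastProgramDateTime`, which is then advanced by that segment's duration. A minimal sketch of the bookkeeping (the `map` stands in for the parser's per-URI callback; the anchor date and durations are illustrative):

```javascript
// Mimics the diff's per-URI handling: assign the running timestamp, then advance it.
let lastProgramDateTime = new Date('2023-01-01T00:00:00.000Z').getTime();

const parsedSegments = [{ duration: 6 }, { duration: 6 }, { duration: 4 }].map(currentUri => {
  if (lastProgramDateTime !== null) {
    currentUri.programDateTime = lastProgramDateTime;
    lastProgramDateTime += currentUri.duration * 1000;
  }
  return currentUri;
});

// Each segment starts where the previous one ended: offsets of 0s, 6s, 12s.
console.log(parsedSegments.map(s => s.programDateTime));
```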

File diff suppressed because one or more lines are too long


@ -1,6 +1,6 @@
{
"name": "m3u8-parser",
"version": "6.2.0",
"version": "7.1.0",
"description": "m3u8 parser",
"main": "dist/m3u8-parser.cjs.js",
"module": "dist/m3u8-parser.es.js",

View file

@ -624,6 +624,16 @@ export default class ParseStream extends Stream {
});
return;
}
match = (/^#EXT-X-CONTENT-STEERING:(.*)$/).exec(newLine);
if (match) {
event = {
type: 'tag',
tagType: 'content-steering'
};
event.attributes = parseAttributes(match[1]);
this.trigger('data', event);
return;
}
// unknown tag type
this.trigger('data', {


@ -97,6 +97,8 @@ export default class Parser extends Stream {
this.parseStream = new ParseStream();
this.lineStream.pipe(this.parseStream);
this.lastProgramDateTime = null;
/* eslint-disable consistent-this */
const self = this;
/* eslint-enable consistent-this */
@ -124,6 +126,7 @@ export default class Parser extends Stream {
this.manifest = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
segments: []
};
// keep track of the last seen segment's byte range end, as segments are not required
@ -132,7 +135,7 @@ export default class Parser extends Stream {
let lastByterangeEnd = 0;
// keep track of the last seen part's byte range end.
let lastPartByterangeEnd = 0;
const daterangeTags = {};
const dateRangeTags = {};
this.on('end', () => {
// only add preloadSegment if we don't yet have a uri for it.
@ -221,6 +224,11 @@ export default class Parser extends Stream {
message: 'defaulting discontinuity sequence to zero'
});
}
if (entry.title) {
currentUri.title = entry.title;
}
if (entry.duration > 0) {
currentUri.duration = entry.duration;
}
@ -459,9 +467,24 @@ export default class Parser extends Stream {
this.manifest.dateTimeString = entry.dateTimeString;
this.manifest.dateTimeObject = entry.dateTimeObject;
}
currentUri.dateTimeString = entry.dateTimeString;
currentUri.dateTimeObject = entry.dateTimeObject;
const { lastProgramDateTime } = this;
this.lastProgramDateTime = new Date(entry.dateTimeString).getTime();
// We should extrapolate Program Date Time backward only during first program date time occurrence.
// Once we have at least one program date time point, we can always extrapolate it forward using lastProgramDateTime reference.
if (lastProgramDateTime === null) {
// Extrapolate Program Date Time backward
// Since it is first program date time occurrence we're assuming that
// all this.manifest.segments have no program date time info
this.manifest.segments.reduceRight((programDateTime, segment) => {
segment.programDateTime = programDateTime - (segment.duration * 1000);
return segment.programDateTime;
}, this.lastProgramDateTime);
}
},
targetduration() {
if (!isFinite(entry.duration) || entry.duration < 0) {
@ -634,70 +657,79 @@ export default class Parser extends Stream {
setHoldBack.call(this, this.manifest);
},
'daterange'() {
this.manifest.daterange = this.manifest.daterange || [];
this.manifest.daterange.push(camelCaseKeys(entry.attributes));
const index = this.manifest.daterange.length - 1;
this.manifest.dateRanges.push(camelCaseKeys(entry.attributes));
const index = this.manifest.dateRanges.length - 1;
this.warnOnMissingAttributes_(
`#EXT-X-DATERANGE #${index}`,
entry.attributes,
['ID', 'START-DATE']
);
const daterange = this.manifest.daterange[index];
const dateRange = this.manifest.dateRanges[index];
if (daterange.endDate && daterange.startDate && new Date(daterange.endDate) < new Date(daterange.startDate)) {
if (dateRange.endDate && dateRange.startDate && new Date(dateRange.endDate) < new Date(dateRange.startDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE END-DATE must be equal to or later than the value of the START-DATE'
});
}
if (daterange.duration && daterange.duration < 0) {
if (dateRange.duration && dateRange.duration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE DURATION must not be negative'
});
}
if (daterange.plannedDuration && daterange.plannedDuration < 0) {
if (dateRange.plannedDuration && dateRange.plannedDuration < 0) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE PLANNED-DURATION must not be negative'
});
}
const endOnNextYes = !!daterange.endOnNext;
const endOnNextYes = !!dateRange.endOnNext;
if (endOnNextYes && !daterange.class) {
if (endOnNextYes && !dateRange.class) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must have a CLASS attribute'
});
}
if (endOnNextYes && (daterange.duration || daterange.endDate)) {
if (endOnNextYes && (dateRange.duration || dateRange.endDate)) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE with an END-ON-NEXT=YES attribute must not contain DURATION or END-DATE attributes'
});
}
if (daterange.duration && daterange.endDate) {
const startDate = daterange.startDate;
const newDateInSeconds = startDate.setSeconds(startDate.getSeconds() + daterange.duration);
if (dateRange.duration && dateRange.endDate) {
const startDate = dateRange.startDate;
const newDateInSeconds = startDate.getTime() + (dateRange.duration * 1000);
this.manifest.daterange[index].endDate = new Date(newDateInSeconds);
this.manifest.dateRanges[index].endDate = new Date(newDateInSeconds);
}
if (daterange && !this.manifest.dateTimeString) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
if (!daterangeTags[daterange.id]) {
daterangeTags[daterange.id] = daterange;
if (!dateRangeTags[dateRange.id]) {
dateRangeTags[dateRange.id] = dateRange;
} else {
for (const attribute in daterangeTags[daterange.id]) {
if (daterangeTags[daterange.id][attribute] !== daterange[attribute]) {
for (const attribute in dateRangeTags[dateRange.id]) {
if (!!dateRange[attribute] && JSON.stringify(dateRangeTags[dateRange.id][attribute]) !== JSON.stringify(dateRange[attribute])) {
this.trigger('warn', {
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes and same attribute values'
message: 'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes values'
});
break;
}
}
// if tags with the same ID do not have conflicting attributes, merge them
const dateRangeWithSameId = this.manifest.dateRanges.findIndex((dateRangeToFind) => dateRangeToFind.id === dateRange.id);
this.manifest.dateRanges[dateRangeWithSameId] = Object.assign(this.manifest.dateRanges[dateRangeWithSameId], dateRange);
dateRangeTags[dateRange.id] = Object.assign(dateRangeTags[dateRange.id], dateRange);
// after merging, delete the duplicate dateRange that was added last
this.manifest.dateRanges.pop();
}
},
'independent-segments'() {
this.manifest.independentSegments = true;
},
'content-steering'() {
this.manifest.contentSteering = camelCaseKeys(entry.attributes);
this.warnOnMissingAttributes_(
'#EXT-X-CONTENT-STEERING',
entry.attributes,
['SERVER-URI']
);
}
})[entry.tagType] || noop).call(self);
},
@ -725,6 +757,12 @@ export default class Parser extends Stream {
// reset the last byterange end as it needs to be 0 between parts
lastPartByterangeEnd = 0;
// Once we have at least one program date time we can always extrapolate it forward
if (this.lastProgramDateTime !== null) {
currentUri.programDateTime = this.lastProgramDateTime;
this.lastProgramDateTime += currentUri.duration * 1000;
}
// prepare for the next URI
currentUri = {};
},
@ -777,7 +815,13 @@ export default class Parser extends Stream {
end() {
// flush any buffered input
this.lineStream.push('\n');
if (this.manifest.dateRanges.length && this.lastProgramDateTime === null) {
this.trigger('warn', {
message: 'A playlist with EXT-X-DATERANGE tag must contain atleast one EXT-X-PROGRAM-DATE-TIME tag'
});
}
this.lastProgramDateTime = null;
this.trigger('end');
}
/**
@ -785,7 +829,7 @@ export default class Parser extends Stream {
*
* @param {Object} options a map of options for the added parser
* @param {RegExp} options.expression a regular expression to match the custom header
* @param {string} options.type the type to register to the output
* @param {string} options.customType the custom type to register to the output
* @param {Function} [options.dataParser] function to parse the line into an object
* @param {boolean} [options.segment] should tag data be attached to the segment object
*/
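The rewritten duplicate handling above no longer treats a repeated `EXT-X-DATERANGE` ID as purely a warning case: it warns only when a re-declared attribute carries a *different* value (compared via `JSON.stringify`, since values may be dates or nested objects), then merges the two tags and pops the duplicate entry off `manifest.dateRanges`. A standalone sketch of the merge path, assuming two compatible tags with the same ID (`Object.assign` stands in for the bundles' `_extends` helper):

```javascript
// Two EXT-X-DATERANGE tags sharing an ID, each carrying disjoint attributes.
const dateRangeTags = { 'ad-1': { id: 'ad-1', startDate: '2023-01-01T00:00:00.000Z' } };
const manifest = { dateRanges: [dateRangeTags['ad-1']] };

// The second tag with the same ID is pushed before dedup, as in the parser.
const dateRange = { id: 'ad-1', duration: 30 };
manifest.dateRanges.push(dateRange);

// No conflicting attribute values, so merge into the first entry and drop the
// duplicate that was added last, mirroring the diff.
const dateRangeWithSameId = manifest.dateRanges.findIndex(d => d.id === dateRange.id);
manifest.dateRanges[dateRangeWithSameId] =
  Object.assign(manifest.dateRanges[dateRangeWithSameId], dateRange);
dateRangeTags[dateRange.id] = Object.assign(dateRangeTags[dateRange.id], dateRange);
manifest.dateRanges.pop();

console.log(manifest.dateRanges); // one merged dateRange with id, startDate, duration
```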


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [


@ -1,6 +1,7 @@
module.exports = {
allowCache: true,
mediaSequence: 0,
dateRanges: [],
playlistType: 'VOD',
segments: [
{


@ -1,6 +1,7 @@
module.exports = {
allowCache: true,
mediaSequence: 0,
dateRanges: [],
playlistType: 'VOD',
segments: [
{


@ -1,6 +1,7 @@
module.exports = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
mediaGroups: {
// TYPE
'AUDIO': {


@ -1,6 +1,7 @@
module.exports = {
allowCache: true,
discontinuityStarts: [],
dateRanges: [],
mediaGroups: {
'AUDIO': {
aac: {


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
playlists: [
{
attributes: {


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [


@ -1,11 +1,13 @@
module.exports = {
allowCache: false,
mediaSequence: 0,
dateRanges: [],
playlistType: 'VOD',
segments: [
{
dateTimeString: '2016-06-22T09:20:16.166-04:00',
dateTimeObject: new Date('2016-06-22T09:20:16.166-04:00'),
programDateTime: 1466601616166,
duration: 10,
timeline: 0,
uri: 'hls_450k_video.ts'
@ -13,6 +15,7 @@ module.exports = {
{
dateTimeString: '2016-06-22T09:20:26.166-04:00',
dateTimeObject: new Date('2016-06-22T09:20:26.166-04:00'),
programDateTime: 1466601626166,
duration: 10,
timeline: 0,
uri: 'hls_450k_video.ts'
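This fixture gains explicit `programDateTime` values alongside the existing `dateTimeString`/`dateTimeObject` fields; each is simply the tag's wall-clock time expressed as a Unix epoch in milliseconds. The expected constants can be re-derived directly from the date strings:

```javascript
// programDateTime is the EXT-X-PROGRAM-DATE-TIME value as epoch milliseconds.
const first = new Date('2016-06-22T09:20:16.166-04:00').getTime();
const second = new Date('2016-06-22T09:20:26.166-04:00').getTime();

console.log(first);  // 1466601616166, as in the fixture
console.log(second); // 1466601626166, ten seconds later
```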


@ -2,6 +2,7 @@ module.exports = {
allowCache: true,
discontinuitySequence: 0,
discontinuityStarts: [],
dateRanges: [],
mediaSequence: 7794,
segments: [
{


@ -1,5 +1,6 @@
module.exports = {
allowCache: false,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [


@ -1,28 +1,33 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
discontinuitySequence: 3,
segments: [
{
duration: 10,
timeline: 3,
uri: '001.ts'
uri: '001.ts',
title: '0'
},
{
duration: 19,
timeline: 3,
uri: '002.ts'
uri: '002.ts',
title: '0'
},
{
discontinuity: true,
duration: 10,
timeline: 4,
uri: '003.ts'
uri: '003.ts',
title: '0'
},
{
duration: 11,
timeline: 4,
uri: '004.ts'
uri: '004.ts',
title: '0'
}
],
targetDuration: 19,


@ -1,55 +1,65 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
discontinuitySequence: 0,
segments: [
{
duration: 10,
timeline: 0,
uri: '001.ts'
uri: '001.ts',
title: '0'
},
{
duration: 19,
timeline: 0,
uri: '002.ts'
uri: '002.ts',
title: '0'
},
{
discontinuity: true,
duration: 10,
timeline: 1,
uri: '003.ts'
uri: '003.ts',
title: '0'
},
{
duration: 11,
timeline: 1,
uri: '004.ts'
uri: '004.ts',
title: '0'
},
{
discontinuity: true,
duration: 10,
timeline: 2,
uri: '005.ts'
uri: '005.ts',
title: '0'
},
{
duration: 10,
timeline: 2,
uri: '006.ts'
uri: '006.ts',
title: '0'
},
{
duration: 10,
timeline: 2,
uri: '007.ts'
uri: '007.ts',
title: '0'
},
{
discontinuity: true,
duration: 10,
timeline: 3,
uri: '008.ts'
uri: '008.ts',
title: '0'
},
{
duration: 16,
timeline: 3,
uri: '009.ts'
uri: '009.ts',
title: '0'
}
],
targetDuration: 19,


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
segments: []
};


@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [


@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
playlists: [
{
attributes: {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 7794,
discontinuitySequence: 0,
discontinuityStarts: [],

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'EVENT',
segments: [

@ -1,11 +1,13 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 1,
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
@ -19,7 +20,8 @@ module.exports = {
},
duration: 10,
timeline: 0,
uri: 'hls_450k_video.ts'
uri: 'hls_450k_video.ts',
title: ';asljasdfii11)))00,'
},
{
byterange: {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 1,
playlistType: 'VOD',
targetDuration: 6,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
segments: []
};

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,12 +1,14 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 8,

@ -1,51 +1,61 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{
duration: 10,
timeline: 0,
uri: '001.ts'
uri: '001.ts',
title: '0'
},
{
duration: 19,
timeline: 0,
uri: '002.ts'
uri: '002.ts',
title: '0'
},
{
duration: 10,
timeline: 0,
uri: '003.ts'
uri: '003.ts',
title: '0'
},
{
duration: 11,
timeline: 0,
uri: '004.ts'
uri: '004.ts',
title: '0'
},
{
duration: 10,
timeline: 0,
uri: '005.ts'
uri: '005.ts',
title: '0'
},
{
duration: 10,
timeline: 0,
uri: '006.ts'
uri: '006.ts',
title: '0'
},
{
duration: 10,
timeline: 0,
uri: '007.ts'
uri: '007.ts',
title: '0'
},
{
duration: 10,
timeline: 0,
uri: '008.ts'
uri: '008.ts',
title: '0'
},
{
duration: 16,
timeline: 0,
uri: '009.ts'
uri: '009.ts',
title: '0'
}
],
targetDuration: 10,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuitySequence: 0,
discontinuityStarts: [],
mediaSequence: 0,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuitySequence: 0,
discontinuityStarts: [],
mediaSequence: 0,

@ -2,6 +2,7 @@ module.exports = {
allowCache: true,
dateTimeObject: new Date('2019-02-14T02:13:36.106Z'),
dateTimeString: '2019-02-14T02:13:36.106Z',
dateRanges: [],
discontinuitySequence: 0,
discontinuityStarts: [],
mediaSequence: 266,
@ -40,6 +41,7 @@ module.exports = {
{
dateTimeObject: new Date('2019-02-14T02:13:36.106Z'),
dateTimeString: '2019-02-14T02:13:36.106Z',
programDateTime: 1550110416106,
duration: 4.00008,
map: {
uri: 'init.mp4'
@ -52,6 +54,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110420106.08,
timeline: 0,
uri: 'fileSequence267.mp4'
},
@ -60,6 +63,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110424106.1602,
timeline: 0,
uri: 'fileSequence268.mp4'
},
@ -68,6 +72,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110428106.2402,
timeline: 0,
uri: 'fileSequence269.mp4'
},
@ -76,6 +81,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110432106.3203,
timeline: 0,
uri: 'fileSequence270.mp4'
},
@ -84,6 +90,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110436106.4004,
timeline: 0,
uri: 'fileSequence271.mp4',
parts: [
@ -146,6 +153,7 @@ module.exports = {
map: {
uri: 'init.mp4'
},
programDateTime: 1550110440106,
timeline: 0,
uri: 'fileSequence272.mp4',
parts: [

@ -2,6 +2,7 @@ module.exports = {
allowCache: true,
dateTimeObject: new Date('2019-02-14T02:14:00.106Z'),
dateTimeString: '2019-02-14T02:14:00.106Z',
dateRanges: [],
discontinuitySequence: 0,
discontinuityStarts: [],
mediaSequence: 266,
@ -42,16 +43,19 @@ module.exports = {
segments: [
{
duration: 4.00008,
programDateTime: 1550110428105.7598,
timeline: 0,
uri: 'fileSequence269.mp4'
},
{
duration: 4.00008,
programDateTime: 1550110432105.8398,
timeline: 0,
uri: 'fileSequence270.mp4'
},
{
duration: 4.00008,
programDateTime: 1550110436105.92,
timeline: 0,
uri: 'fileSequence271.mp4',
parts: [
@ -111,6 +115,7 @@ module.exports = {
dateTimeObject: new Date('2019-02-14T02:14:00.106Z'),
dateTimeString: '2019-02-14T02:14:00.106Z',
duration: 4.00008,
programDateTime: 1550110440106,
timeline: 0,
uri: 'fileSequence272.mp4',
parts: [

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
mediaGroups: {
'AUDIO': {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
playlists: [
{
attributes: {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
segments: [
{

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,12 +1,14 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
mediaGroups: {
'AUDIO': {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
mediaGroups: {
'AUDIO': {

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
targetDuration: 10,
segments: [

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
discontinuityStarts: [],
mediaGroups: {
'AUDIO': {

@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: -11,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,12 +1,14 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 17,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
playlists: [
{
attributes: {

@ -1,27 +1,32 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 11,
playlistType: 'VOD',
segments: [
{
duration: 6.64,
timeline: 0,
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts'
uri: '/test/ts-files/tvy7/8a5e2822668b5370f4eb1438b2564fb7ab12ffe1-hi720.ts',
title: '{}'
},
{
duration: 6.08,
timeline: 0,
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts'
uri: '/test/ts-files/tvy7/56be1cef869a1c0cc8e38864ad1add17d187f051-hi720.ts',
title: '{}'
},
{
duration: 6.6,
timeline: 0,
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts'
uri: '/test/ts-files/tvy7/549c8c77f55f049741a06596e5c1e01dacaa46d0-hi720.ts',
title: '{}'
},
{
duration: 5,
timeline: 0,
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts'
uri: '/test/ts-files/tvy7/6cfa378684ffeb1c455a64dae6c103290a1f53d4-hi720.ts',
title: '{}'
}
],
targetDuration: 8,

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,5 +1,6 @@
module.exports = {
allowCache: true,
dateRanges: [],
mediaSequence: 0,
playlistType: 'VOD',
segments: [

@ -1,6 +1,7 @@
module.exports = {
allowCache: true,
mediaSequence: 0,
dateRanges: [],
playlistType: 'VOD',
segments: [
{

@ -698,6 +698,14 @@ QUnit.test('parses #EXT-X-STREAM-INF with common attributes', function(assert) {
'avc1.4d400d, mp4a.40.2',
'codecs are parsed'
);
manifest = '#EXT-X-STREAM-INF:PATHWAY-ID="CDN-A"\n';
this.lineStream.push(manifest);
assert.ok(element, 'an event was triggered');
assert.strictEqual(element.type, 'tag', 'the line type is tag');
assert.strictEqual(element.tagType, 'stream-inf', 'the tag type is stream-inf');
assert.strictEqual(element.attributes['PATHWAY-ID'], 'CDN-A', 'pathway-id is parsed');
});
QUnit.test('parses #EXT-X-STREAM-INF with arbitrary attributes', function(assert) {
const manifest = '#EXT-X-STREAM-INF:NUMERIC=24,ALPHA=Value,MIXED=123abc\n';

@ -845,6 +845,71 @@ QUnit.module('m3u8s', function(hooks) {
);
});
QUnit.test('PDT value is assigned to segments with explicit #EXT-X-PROGRAM-DATE-TIME tags', function(assert) {
this.parser.push([
'#EXTM3U',
'#EXT-X-VERSION:6',
'#EXT-X-TARGETDURATION:8',
'#EXT-X-MEDIA-SEQUENCE:0',
'#EXTINF:8.0',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T20:35:35.053+00:00',
'https://example.com/playlist1.m3u8',
'#EXTINF:8.0,',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T22:14:10.053+00:00',
'https://example.com/playlist2.m3u8',
'#EXT-X-ENDLIST'
].join('\n'));
this.parser.end();
assert.equal(this.parser.manifest.segments[0].programDateTime, new Date('2017-07-31T20:35:35.053+00:00').getTime());
assert.equal(this.parser.manifest.segments[1].programDateTime, new Date('2017-07-31T22:14:10.053+00:00').getTime());
});
QUnit.test('backfill PDT values when the first EXT-X-PROGRAM-DATE-TIME tag appears after one or more Media Segment URIs', function(assert) {
this.parser.push([
'#EXTM3U',
'#EXT-X-VERSION:6',
'#EXT-X-TARGETDURATION:8',
'#EXT-X-MEDIA-SEQUENCE:0',
'#EXTINF:8.0',
'https://example.com/playlist1.m3u8',
'#EXTINF:8.0,',
'https://example.com/playlist2.m3u8',
'#EXTINF:8.0',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T20:35:35.053+00:00',
'https://example.com/playlist3.m3u8',
'#EXT-X-ENDLIST'
].join('\n'));
this.parser.end();
const segments = this.parser.manifest.segments;
assert.equal(segments[2].programDateTime, new Date('2017-07-31T20:35:35.053+00:00').getTime());
assert.equal(segments[1].programDateTime, segments[2].programDateTime - (segments[1].duration * 1000));
assert.equal(segments[0].programDateTime, segments[1].programDateTime - (segments[0].duration * 1000));
});
QUnit.test('extrapolates forward when subsequent fragments do not have explicit PDT tags', function(assert) {
this.parser.push([
'#EXTM3U',
'#EXT-X-VERSION:6',
'#EXT-X-TARGETDURATION:8',
'#EXT-X-MEDIA-SEQUENCE:0',
'#EXTINF:8.0',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T20:35:35.053+00:00',
'https://example.com/playlist1.m3u8',
'#EXTINF:8.0,',
'https://example.com/playlist2.m3u8',
'#EXTINF:8.0',
'https://example.com/playlist3.m3u8',
'#EXT-X-ENDLIST'
].join('\n'));
this.parser.end();
const segments = this.parser.manifest.segments;
assert.equal(segments[0].programDateTime, new Date('2017-07-31T20:35:35.053+00:00').getTime());
assert.equal(segments[1].programDateTime, segments[0].programDateTime + segments[1].duration * 1000);
assert.equal(segments[2].programDateTime, segments[1].programDateTime + segments[2].duration * 1000);
});
QUnit.test('warns when #EXT-X-DATERANGE missing attribute', function(assert) {
this.parser.push([
'#EXT-X-VERSION:3',
@ -966,7 +1031,7 @@ QUnit.module('m3u8s', function(hooks) {
);
});
QUnit.test('warns when playlist has multiple #EXT-X-DATERANGE tag same ID but different attribute names and values', function(assert) {
QUnit.test('warns when playlist has multiple #EXT-X-DATERANGE tag same ID but different attribute values', function(assert) {
this.parser.push([
'#EXT-X-VERSION:3',
'#EXT-X-MEDIA-SEQUENCE:0',
@ -976,12 +1041,12 @@ QUnit.module('m3u8s', function(hooks) {
'#EXT-X-ENDLIST',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T20:35:35.053+00:00',
'#EXT-X-DATERANGE:ID="12345",START-DATE="2023-04-13T18:16:15.840000Z",END-ON-NEXT=YES,CLASS="CLASSATTRIBUTE"',
'#EXT-X-DATERANGE:ID="12345",START-DATE="2023-04-13T18:16:20.840000Z"'
'#EXT-X-DATERANGE:ID="12345",START-DATE="2023-04-13T18:16:15.840000Z",CLASS="CLASSATTRIBUTE1"'
].join('\n'));
this.parser.end();
const warnings = [
'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes and same attribute values'
'EXT-X-DATERANGE tags with the same ID in a playlist must have the same attributes values'
];
assert.deepEqual(
@ -1004,7 +1069,7 @@ QUnit.module('m3u8s', function(hooks) {
].join('\n'));
this.parser.end();
assert.deepEqual(this.parser.manifest.daterange[0].endDate, new Date('2023-04-13T15:16:29.840000Z'));
assert.deepEqual(this.parser.manifest.dateRanges[0].endDate, new Date('2023-04-13T15:16:29.840000Z'));
});
QUnit.test('warns when playlist contains #EXT-X-DATERANGE tag but no #EXT-X-PROGRAM-DATE-TIME', function(assert) {
@ -1030,7 +1095,33 @@ QUnit.module('m3u8s', function(hooks) {
);
});
QUnit.test(' playlist with multiple ext-x-daterange ', function(assert) {
QUnit.test('playlist with multiple ext-x-daterange with same ID but no conflicting attributes', function(assert) {
const expectedDateRange = {
id: '12345',
scte35In: '0xFC30200FFF2',
scte35Out: '0xFC30200FFF2',
startDate: new Date('2023-04-13T18:16:15.840000Z'),
class: 'CLASSATTRIBUTE'
};
this.parser.push([
'#EXT-X-VERSION:3',
'#EXT-X-MEDIA-SEQUENCE:0',
'#EXT-X-DISCONTINUITY-SEQUENCE:0',
'#EXTINF:10,',
'media-00001.ts',
'#EXT-X-ENDLIST',
'#EXT-X-PROGRAM-DATE-TIME:2017-07-31T20:35:35.053+00:00',
'#EXT-X-DATERANGE:ID="12345",SCTE35-IN=0xFC30200FFF2,START-DATE="2023-04-13T18:16:15.840000Z",CLASS="CLASSATTRIBUTE"',
'#EXT-X-DATERANGE:ID="12345",SCTE35-OUT=0xFC30200FFF2,START-DATE="2023-04-13T18:16:15.840000Z"'
].join('\n'));
this.parser.end();
assert.equal(this.parser.manifest.dateRanges.length, 1, 'two dateranges with same ID are merged');
assert.deepEqual(this.parser.manifest.dateRanges[0], expectedDateRange);
});
QUnit.test('playlist with multiple ext-x-daterange ', function(assert) {
this.parser.push([
' #EXTM3U',
'#EXT-X-VERSION:6',
@ -1053,7 +1144,7 @@ QUnit.module('m3u8s', function(hooks) {
'#EXT-X-ENDLIST'
].join('\n'));
this.parser.end();
assert.equal(this.parser.manifest.daterange.length, 3);
assert.equal(this.parser.manifest.dateRanges.length, 3);
});
QUnit.test('parses #EXT-X-INDEPENDENT-SEGMENTS', function(assert) {
@ -1067,6 +1158,35 @@ QUnit.module('m3u8s', function(hooks) {
assert.equal(this.parser.manifest.independentSegments, true);
});
QUnit.test('parses #EXT-X-CONTENT-STEERING', function(assert) {
const expectedContentSteeringObject = {
serverUri: '/foo?bar=00012',
pathwayId: 'CDN-A'
};
this.parser.push('#EXT-X-CONTENT-STEERING:SERVER-URI="/foo?bar=00012",PATHWAY-ID="CDN-A"');
this.parser.end();
assert.deepEqual(this.parser.manifest.contentSteering, expectedContentSteeringObject);
});
QUnit.test('parses #EXT-X-CONTENT-STEERING without PATHWAY-ID', function(assert) {
const expectedContentSteeringObject = {
serverUri: '/bar?foo=00012'
};
this.parser.push('#EXT-X-CONTENT-STEERING:SERVER-URI="/bar?foo=00012"');
this.parser.end();
assert.deepEqual(this.parser.manifest.contentSteering, expectedContentSteeringObject);
});
QUnit.test('warns on #EXT-X-CONTENT-STEERING missing SERVER-URI', function(assert) {
const warning = ['#EXT-X-CONTENT-STEERING lacks required attribute(s): SERVER-URI'];
this.parser.push('#EXT-X-CONTENT-STEERING:PATHWAY-ID="CDN-A"');
this.parser.end();
assert.deepEqual(this.warnings, warning, 'warnings as expected');
});
QUnit.module('integration');
for (const key in testDataExpected) {

@ -0,0 +1,190 @@
<a name="7.0.0"></a>
# [7.0.0](https://github.com/videojs/mux.js/compare/v6.3.0...v7.0.0) (2023-07-21)
### Features
* add position data to captions ([#434](https://github.com/videojs/mux.js/issues/434)) ([30f2132](https://github.com/videojs/mux.js/commit/30f2132))
### Chores
* add npm publish step to the release workflow ([a8306cd](https://github.com/videojs/mux.js/commit/a8306cd))
* rename workflow name from github-release to release and add discussion category name for github releases ([4ba1607](https://github.com/videojs/mux.js/commit/4ba1607))
* Update CI and release workflows ([#431](https://github.com/videojs/mux.js/issues/431)) ([dc56f1b](https://github.com/videojs/mux.js/commit/dc56f1b))
* update collaborator guide md ([51b3ed4](https://github.com/videojs/mux.js/commit/51b3ed4))
* update git push suggestion in collaborator guide md ([73a5b60](https://github.com/videojs/mux.js/commit/73a5b60))
<a name="6.3.0"></a>
# [6.3.0](https://github.com/videojs/mux.js/compare/v6.2.0...v6.3.0) (2023-02-22)
### Features
* support emsg box parsing ([2e77285](https://github.com/videojs/mux.js/commit/2e77285))
### Bug Fixes
* emsg ie11 test failures ([528e9ed](https://github.com/videojs/mux.js/commit/528e9ed))
<a name="6.2.0"></a>
# [6.2.0](https://github.com/videojs/mux.js/compare/v6.1.0...v6.2.0) (2022-07-08)
### Features
* add ID3 parsing for text, link, and APIC frames ([#412](https://github.com/videojs/mux.js/issues/412)) ([5454bdd](https://github.com/videojs/mux.js/commit/5454bdd))
### Bug Fixes
* replace indexOf with typedArrayIndexOf for IE11 support ([#417](https://github.com/videojs/mux.js/issues/417)) ([4e1b195](https://github.com/videojs/mux.js/commit/4e1b195))
<a name="6.1.0"></a>
# [6.1.0](https://github.com/videojs/mux.js/compare/v6.0.1...v6.1.0) (2022-05-26)
### Features
* send ID3 tag even when a frame has malformed content ([#408](https://github.com/videojs/mux.js/issues/408)) ([1da5d23](https://github.com/videojs/mux.js/commit/1da5d23))
<a name="6.0.1"></a>
## [6.0.1](https://github.com/videojs/mux.js/compare/v6.0.0...v6.0.1) (2021-12-20)
### Bug Fixes
* fix IE11 by replacing arrow function ([#406](https://github.com/videojs/mux.js/issues/406)) ([47302fe](https://github.com/videojs/mux.js/commit/47302fe))
<a name="6.0.0"></a>
# [6.0.0](https://github.com/videojs/mux.js/compare/v5.14.1...v6.0.0) (2021-11-29)
### Features
* use bigint for 64 bit ints if needed and available. ([#383](https://github.com/videojs/mux.js/issues/383)) ([83779b9](https://github.com/videojs/mux.js/commit/83779b9))
### Chores
* don't run tests on version ([#404](https://github.com/videojs/mux.js/issues/404)) ([45623ea](https://github.com/videojs/mux.js/commit/45623ea))
### BREAKING CHANGES
* In some cases, mux.js will now be returning a BigInt rather than a regular Number value. This means that consumers of this library will need to add checks for BigInt for optimal operation.
<a name="5.14.1"></a>
## [5.14.1](https://github.com/videojs/mux.js/compare/v5.14.0...v5.14.1) (2021-10-14)
### Bug Fixes
* avoid mismatch with avc1 and hvc1 codec ([#400](https://github.com/videojs/mux.js/issues/400)) ([8a58d6e](https://github.com/videojs/mux.js/commit/8a58d6e))
* prevent adding duplicate log listeners on every push after a flush ([#402](https://github.com/videojs/mux.js/issues/402)) ([eb332c1](https://github.com/videojs/mux.js/commit/eb332c1))
<a name="5.14.0"></a>
# [5.14.0](https://github.com/videojs/mux.js/compare/v5.13.0...v5.14.0) (2021-09-21)
### Features
* Add multibyte character support ([#398](https://github.com/videojs/mux.js/issues/398)) ([0849e0a](https://github.com/videojs/mux.js/commit/0849e0a))
<a name="5.13.0"></a>
# [5.13.0](https://github.com/videojs/mux.js/compare/v5.12.2...v5.13.0) (2021-08-24)
### Features
* add firstSequenceNumber option to Transmuxer to start sequence somewhere other than zero ([#395](https://github.com/videojs/mux.js/issues/395)) ([6ff42f4](https://github.com/videojs/mux.js/commit/6ff42f4))
### Chores
* add github release ci action ([#397](https://github.com/videojs/mux.js/issues/397)) ([abe7936](https://github.com/videojs/mux.js/commit/abe7936))
* update ci workflow to fix ci ([#396](https://github.com/videojs/mux.js/issues/396)) ([86cfdca](https://github.com/videojs/mux.js/commit/86cfdca))
<a name="5.12.2"></a>
## [5.12.2](https://github.com/videojs/mux.js/compare/v5.12.1...v5.12.2) (2021-07-14)
### Bug Fixes
* Do not scale width by sarRatio, let decoder handle it via the pasp box ([#393](https://github.com/videojs/mux.js/issues/393)) ([9e9982f](https://github.com/videojs/mux.js/commit/9e9982f))
<a name="5.12.1"></a>
## [5.12.1](https://github.com/videojs/mux.js/compare/v5.12.0...v5.12.1) (2021-07-09)
### Code Refactoring
* rename warn event to log, change console logs to log events ([#392](https://github.com/videojs/mux.js/issues/392)) ([4995603](https://github.com/videojs/mux.js/commit/4995603))
<a name="5.12.0"></a>
# [5.12.0](https://github.com/videojs/mux.js/compare/v5.11.3...v5.12.0) (2021-07-02)
### Features
* add general error/warn/debug log events and log skipped adts data ([#391](https://github.com/videojs/mux.js/issues/391)) ([6588d48](https://github.com/videojs/mux.js/commit/6588d48))
<a name="5.11.3"></a>
## [5.11.3](https://github.com/videojs/mux.js/compare/v5.11.2...v5.11.3) (2021-06-30)
### Bug Fixes
* Prevent skipping frames when we have garbage data between adts sync words ([#390](https://github.com/videojs/mux.js/issues/390)) ([71bac64](https://github.com/videojs/mux.js/commit/71bac64))
<a name="5.11.2"></a>
## [5.11.2](https://github.com/videojs/mux.js/compare/v5.11.1...v5.11.2) (2021-06-24)
### Bug Fixes
* on flush if a pmt has not been emitted and we have one, emit it ([#388](https://github.com/videojs/mux.js/issues/388)) ([67b4aab](https://github.com/videojs/mux.js/commit/67b4aab))
<a name="5.11.1"></a>
## [5.11.1](https://github.com/videojs/mux.js/compare/v5.11.0...v5.11.1) (2021-06-22)
### Bug Fixes
* inspect all program map tables for stream types ([#386](https://github.com/videojs/mux.js/issues/386)) ([bac4da9](https://github.com/videojs/mux.js/commit/bac4da9))
<a name="5.11.0"></a>
# [5.11.0](https://github.com/videojs/mux.js/compare/v5.10.0...v5.11.0) (2021-03-29)
### Features
* parse ctts atom in mp4 inspector ([#379](https://github.com/videojs/mux.js/issues/379)) ([b75a7a4](https://github.com/videojs/mux.js/commit/b75a7a4))
* stss atom parsing ([#380](https://github.com/videojs/mux.js/issues/380)) ([305eb4f](https://github.com/videojs/mux.js/commit/305eb4f))
<a name="5.10.0"></a>
# [5.10.0](https://github.com/videojs/mux.js/compare/v5.9.2...v5.10.0) (2021-03-05)
### Features
* parse edts boxes ([#375](https://github.com/videojs/mux.js/issues/375)) ([989bffd](https://github.com/videojs/mux.js/commit/989bffd))
### Bug Fixes
* Check if baseTimestamp is NaN ([#370](https://github.com/videojs/mux.js/issues/370)) ([b4e61dd](https://github.com/videojs/mux.js/commit/b4e61dd))
* only parse PES packets as PES packets ([#378](https://github.com/videojs/mux.js/issues/378)) ([bb984db](https://github.com/videojs/mux.js/commit/bb984db))
<a name="5.9.2"></a>
## [5.9.2](https://github.com/videojs/mux.js/compare/v5.9.1...v5.9.2) (2021-02-24)
### Features
* add a nodejs binary for transmux via command line ([#366](https://github.com/videojs/mux.js/issues/366)) ([b87ed0f](https://github.com/videojs/mux.js/commit/b87ed0f))
### Bug Fixes
* ts inspect ptsTime/dtsTime typo ([#377](https://github.com/videojs/mux.js/issues/377)) ([112e6e1](https://github.com/videojs/mux.js/commit/112e6e1))
### Chores
* switch to rollup-plugin-data-files ([#369](https://github.com/videojs/mux.js/issues/369)) ([0bb1556](https://github.com/videojs/mux.js/commit/0bb1556))
* update vjsverify to fix publish failure ([cb06bb5](https://github.com/videojs/mux.js/commit/cb06bb5))
<a name="5.9.1"></a>
## [5.9.1](https://github.com/videojs/mux.js/compare/v5.9.0...v5.9.1) (2021-01-20)
### Chores
* **package:** fixup browser field ([#368](https://github.com/videojs/mux.js/issues/368)) ([8926506](https://github.com/videojs/mux.js/commit/8926506))
<a name="5.9.0"></a>
# [5.9.0](https://github.com/videojs/mux.js/compare/v5.8.0...v5.9.0) (2021-01-20)
### Features
* **CaptionStream:** add flag to turn off 708 captions ([#365](https://github.com/videojs/mux.js/issues/365)) ([8a7cdb6](https://github.com/videojs/mux.js/commit/8a7cdb6))
### Chores
* update this project to use the generator ([#352](https://github.com/videojs/mux.js/issues/352)) ([fa920a6](https://github.com/videojs/mux.js/commit/fa920a6))

@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright Brightcove, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

# mux.js
[![Build Status](https://travis-ci.org/videojs/mux.js.svg?branch=main)](https://travis-ci.org/videojs/mux.js)[![Greenkeeper badge](https://badges.greenkeeper.io/videojs/mux.js.svg)](https://greenkeeper.io/)
[![Slack Status](http://slack.videojs.com/badge.svg)](http://slack.videojs.com)
Lightweight utilities for inspecting and manipulating video container formats.
Maintenance Status: Stable
## Table of Contents
- [Installation](#installation)
- [NPM](#npm)
- [Manual Build](#manual-build)
- [Building](#building)
- [Collaborator](#collaborator)
- [Contributing](#contributing)
- [Options](#options)
- [Background](#background)
- [fmp4](#fmp4)
- [MPEG2-TS to fMP4 Transmuxer](#mpeg2-ts-to-fmp4-transmuxer)
- [Diagram](#diagram)
- [Usage Examples](#usage-examples)
- [Basic Usage](#basic-usage)
- [Metadata](#metadata)
- [MP4 Inspector](#mp4-inspector)
- [Documentation](#documentation)
- [Talk to Us](#talk-to-us)
## Installation
### NPM
To install `mux.js` with npm run
```bash
npm install --save mux.js
```
### Manual Build
Download a copy of this git repository and then follow the steps in [Building](#building)
## Building
If you're using this project in a node-like environment, just `require()` whatever you need. If you'd like to package up a distribution to include separately, run `npm run build`. See the `package.json` for other handy scripts if you're thinking about contributing.
## Collaborator
If you are a collaborator, we have a guide on how to [release](./COLLABORATOR_GUIDE.md#releasing) the project.
## Contributing
If you are interested in contributing to `mux.js`, take a look at our docs on [streams](/docs/streams.md) to get started.
## Options
The exported `muxjs` object contains the following modules:
- [codecs](#codecs): a module for handling various codecs
- [mp4](#mp4): a module for handling ISOBMFF MP4 boxes
- [flv](#flv): a module for handling Flash content
- [mp2t](#mp2t): a module for handling MPEG 2 Transport Stream content
### Codecs
#### Adts
`muxjs.codecs.Adts`
The ADTS (Audio Data Transport Stream) module handles audio data, specifically AAC. It includes an `AdtsStream` that takes ADTS audio and parses out AAC frames to pass on to the next Stream component in a pipeline.
#### h264
`muxjs.codecs.h264`
The h264 module handles H.264 bitstreams, including a `NalByteStream` and an `H264Stream` to parse out NAL units and pass them on to the next Stream component in a pipeline.
### mp4
#### MP4 Generator
`muxjs.mp4.generator`
The MP4 Generator module contains multiple functions that can be used to generate fragmented MP4s (fmp4s) that can be used in MSE.
#### MP4 Probe
`muxjs.mp4.probe`
The MP4 Probe contains basic utilities that can be used to parse metadata about an MP4 segment. Examples include reading the `timescale` and getting the base media decode time of a fragment in seconds.
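For instance, once a probe has reported a fragment's `timescale` and base media decode time, converting the start time to seconds is a single division (a sketch with hypothetical values, not output from a real probe):

```javascript
// Hypothetical values of the kind a probe reports for a fragment:
var timescale = 90000;             // ticks per second for this track
var baseMediaDecodeTime = 900000;  // fragment start, in timescale ticks

// Start time of the fragment in seconds:
var startTimeSeconds = baseMediaDecodeTime / timescale;
```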
#### MP4 Transmuxer
`muxjs.mp4.Transmuxer`
Takes MPEG2-TS segments and transmuxes them into fmp4 segments.
Options:
##### baseMediaDecodeTime
Type: `number`
Default: `0`
The Base Media Decode Time of the first segment to be passed into the transmuxer.
##### keepOriginalTimestamps
Type: `boolean`
Default: `false`
The default behavior of the MP4 Transmuxer is to rewrite the timestamps of media segments to ensure that they begin at `0` on the media timeline in MSE. To avoid this behavior, you may set this option to `true`.
**Note**: This will affect behavior of captions and metadata, and these may not align with audio and video without additional manipulation of timing metadata.
##### remux
Type: `boolean`
Default: `true`
Set to `true` to remux audio and video into a single MP4 segment.
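Putting the three options together, here is a sketch of an options object with all of the defaults spelled out explicitly (any of them may be omitted):

```javascript
// All three options with their default values; pass this object to
// new muxjs.mp4.Transmuxer(initOptions) in a real page.
var initOptions = {
  baseMediaDecodeTime: 0,        // BMDT of the first segment pushed in
  keepOriginalTimestamps: false, // rewrite timestamps to start at 0 in MSE
  remux: true                    // combine audio and video into one segment
};
```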
#### CaptionParser
`muxjs.mp4.CaptionParser`
This module reads CEA-608 captions out of FMP4 segments.
#### Tools
`muxjs.mp4.tools`
This module includes utilities to parse MP4s into an equivalent JavaScript object, primarily for debugging purposes.
### flv
#### Transmuxer
`muxjs.flv.Transmuxer`
Takes MPEG2-TS segments and transmuxes them into FLV segments. This module is in maintenance mode and will not have further major development.
#### Tools
`muxjs.flv.tools`
This module includes utilities to parse FLV tags into an equivalent JavaScript object, primarily for debugging purposes.
### mp2t
`muxjs.mp2t`
Contains Streams specifically to handle MPEG2-TS data, for example `ElementaryStream` and `TransportPacketStream`. This is used in the MP4 module.
#### CaptionStream
`muxjs.mp2t.CaptionStream`
Handles the bulk of parsing CEA-608 captions out of MPEG2-TS segments.
#### Tools
`muxjs.mp2t.tools`
This module contains utilities to parse basic timing information out of MPEG2-TS segments.
## Background
### fMP4
Before making use of the Transmuxer it is best to understand the structure of a fragmented MP4 (fMP4).
fMP4s are structured in *boxes*, as described in the ISOBMFF spec.
For a basic fMP4 to be valid it needs to have the following boxes:
1) ftyp (File Type Box)
2) moov (Movie Box)
3) moof (Movie Fragment Box)
4) mdat (Media Data Box)
Every fMP4 stream needs to start with `ftyp` and `moov` boxes, which are then followed by many `moof`/`mdat` pairs.
It is important to understand that the first segment you append to [Media Source Extensions](https://www.w3.org/TR/media-source/) needs to start with an `ftyp` and `moov` followed by a `moof` and `mdat`. A segment containing the `ftyp` and `moov` boxes is often referred to as an initialization (`init`) segment, and segments containing `moof` and `mdat` boxes, which carry the media itself, are referred to as media segments.
If you would like to see a clearer representation of your fMP4 you can use the `muxjs.mp4.tools.inspect()` method.
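The box layout described above can also be sketched without mux.js at all: every ISOBMFF box begins with a 4-byte big-endian size (which includes the 8-byte header) followed by a 4-byte type, so walking the top-level boxes of a buffer is a short loop. This is a minimal sketch; real files may also use 64-bit `largesize` headers, which it ignores:

```javascript
// Walk the top-level boxes of an ISOBMFF buffer and list their types/sizes.
function listBoxes(bytes) {
  var boxes = [];
  var i = 0;
  while (i + 8 <= bytes.length) {
    // 4-byte big-endian size, then 4-byte ASCII type
    var size = (bytes[i] << 24 | bytes[i + 1] << 16 | bytes[i + 2] << 8 | bytes[i + 3]) >>> 0;
    var type = String.fromCharCode(bytes[i + 4], bytes[i + 5], bytes[i + 6], bytes[i + 7]);
    boxes.push({ type: type, size: size });
    if (size < 8) { break; } // malformed size; avoid an infinite loop
    i += size;
  }
  return boxes;
}

// An empty 8-byte 'ftyp' box followed by an empty 8-byte 'moov' box:
var bytes = new Uint8Array([
  0, 0, 0, 8, 0x66, 0x74, 0x79, 0x70, // ftyp
  0, 0, 0, 8, 0x6d, 0x6f, 0x6f, 0x76  // moov
]);
listBoxes(bytes); // [{type: 'ftyp', size: 8}, {type: 'moov', size: 8}]
```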
## MPEG2-TS to fMP4 Transmuxer
### Diagram
![mux.js diagram](/docs/diagram.png)
## Usage Examples
### Basic Usage
To make use of the Transmuxer you will need to push data into the transmuxer you have created.
Feed in `Uint8Array`s of an MPEG-2 transport stream, get out a fragmented MP4.
Let's look at a very basic representation of what needs to happen the first time you want to append an fMP4 to an MSE buffer.
```js
// Create your transmuxer:
// initOptions is optional and can be omitted at this time.
var transmuxer = new muxjs.mp4.Transmuxer(initOptions);
// Create an event listener which will be triggered after the transmuxer processes data:
// 'data' events signal a new fMP4 segment is ready
transmuxer.on('data', function (segment) {
// This code will be executed when the event listener is triggered by a Transmuxer.push() method execution.
// Create an empty Uint8Array with the summed value of both the initSegment and data byteLength properties.
let data = new Uint8Array(segment.initSegment.byteLength + segment.data.byteLength);
// Add the segment.initSegment (ftyp/moov) starting at position 0
data.set(segment.initSegment, 0);
// Add the segment.data (moof/mdat) starting after the initSegment
data.set(segment.data, segment.initSegment.byteLength);
// Uncomment this line below to see the structure of your new fMP4
// console.log(muxjs.mp4.tools.inspect(data));
// Add your brand new fMP4 segment to your MSE Source Buffer
sourceBuffer.appendBuffer(data);
});
// When you push your starting MPEG-TS segment it will cause the 'data' event listener above to run.
// It is important to push after your event listener has been defined.
transmuxer.push(transportStreamSegment);
transmuxer.flush();
```
Above we are adding the `initSegment` (ftyp/moov) to our data array before appending to the MSE Source Buffer.
This is required for the first piece of data we append to the MSE Source Buffer, but we will omit the `initSegment` for the remaining chunks (moof/mdat) of video we append to the Source Buffer.
When appending additional segments after the first, we just need an event listener like the following:
```js
transmuxer.on('data', function(segment){
sourceBuffer.appendBuffer(new Uint8Array(segment.data));
});
```
Here we put all of this together in a very basic example player.
```html
<html>
<head>
<title>Basic Transmuxer Test</title>
</head>
<body>
<video controls width="80%"></video>
<script src="https://github.com/videojs/mux.js/releases/latest/download/mux.js"></script>
<script>
// Create array of TS files to play
segments = [
"segment-0.ts",
"segment-1.ts",
"segment-2.ts",
];
// Replace this value with your file's codec info
mime = 'video/mp4; codecs="mp4a.40.2,avc1.64001f"';
let mediaSource = new MediaSource();
let transmuxer = new muxjs.mp4.Transmuxer();
video = document.querySelector('video');
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener("sourceopen", appendFirstSegment);
function appendFirstSegment(){
if (segments.length == 0){
return;
}
URL.revokeObjectURL(video.src);
sourceBuffer = mediaSource.addSourceBuffer(mime);
sourceBuffer.addEventListener('updateend', appendNextSegment);
transmuxer.on('data', (segment) => {
let data = new Uint8Array(segment.initSegment.byteLength + segment.data.byteLength);
data.set(segment.initSegment, 0);
data.set(segment.data, segment.initSegment.byteLength);
console.log(muxjs.mp4.tools.inspect(data));
sourceBuffer.appendBuffer(data);
})
fetch(segments.shift()).then((response)=>{
return response.arrayBuffer();
}).then((response)=>{
transmuxer.push(new Uint8Array(response));
transmuxer.flush();
})
}
function appendNextSegment(){
// reset the 'data' event listener to just append (moof/mdat) boxes to the Source Buffer
transmuxer.off('data');
transmuxer.on('data', (segment) =>{
sourceBuffer.appendBuffer(new Uint8Array(segment.data));
})
if (segments.length == 0){
// notify MSE that we have no more segments to append.
mediaSource.endOfStream();
return;
}
// fetch the next segment from the segments array and pass it into the transmuxer.push method
fetch(segments.shift()).then((response)=>{
return response.arrayBuffer();
}).then((response)=>{
transmuxer.push(new Uint8Array(response));
transmuxer.flush();
})
}
</script>
</body>
</html>
```
*NOTE: This player is only for example and should not be used in production.*
### Metadata
The transmuxer can also parse out supplementary video data like timed ID3 metadata and CEA-608 captions.
You can find both attached to the data event object:
```js
transmuxer.on('data', function (segment) {
// create a metadata text track cue for each ID3 frame:
segment.metadata.frames.forEach(function(frame) {
metadataTextTrack.addCue(new VTTCue(time, time, frame.value));
});
// create a VTTCue for all the parsed CEA-608 captions:
segment.captions.forEach(function(cue) {
captionTextTrack.addCue(new VTTCue(cue.startTime, cue.endTime, cue.text));
});
});
```
### MP4 Inspector
Parse MP4s into JavaScript objects or a text representation for display or debugging:
```js
// drop in a Uint8Array of an MP4:
var parsed = muxjs.mp4.tools.inspect(bytes);
// dig into the boxes:
console.log('The major brand of the first box:', parsed[0].majorBrand);
// print out the structure of the MP4:
document.body.appendChild(document.createTextNode(muxjs.textifyMp4(parsed)));
```
The MP4 inspector is used extensively as a debugging tool for the transmuxer. You can see it in action by cloning the project and opening [the debug page](https://github.com/videojs/mux.js/blob/master/debug/index.html) in your browser.
## Documentation
Check out our [troubleshooting guide](/docs/troubleshooting.md).
We have some tips on [creating test content](/docs/test-content.md).
Also, check out our guide on [working with captions in Mux.js](/docs/captions.md).
## Talk to us
Drop by our slack channel (#playback) on the [Video.js slack](http://slack.videojs.com).

#!/usr/bin/env node
/* eslint-disable no-console */
const fs = require('fs');
const path = require('path');
const {Transmuxer} = require('../lib/mp4');
const {version} = require('../package.json');
const {concatTypedArrays} = require('@videojs/vhs-utils/cjs/byte-helpers');
const {ONE_SECOND_IN_TS} = require('../lib/utils/clock.js');
const showHelp = function() {
console.log(`
transmux media-file > foo.mp4
transmux media-file -o foo.mp4
curl -s 'some-media-url' | transmux.js -o foo.mp4
wget -O - -o /dev/null 'some-media-url' | transmux.js -o foo.mp4
Transmux a supported segment (ts or adts) into an fmp4.
-h, --help print help
-v, --version print the version
-o, --output <string> write to a file instead of stdout
-d, --debugger add a break point just before data goes to transmuxer
`);
};
const parseArgs = function(args) {
const options = {};
for (let i = 0; i < args.length; i++) {
const arg = args[i];
if ((/^(--version|-v)$/).test(arg)) {
console.log(`transmux.js v${version}`);
process.exit(0);
} else if ((/^(--help|-h)$/).test(arg)) {
showHelp();
process.exit(0);
} else if ((/^(--debugger|-d)$/).test(arg)) {
options.debugger = true;
} else if ((/^(--output|-o)$/).test(arg)) {
i++;
options.output = args[i];
} else {
options.file = arg;
}
}
return options;
};
const cli = function(stdin) {
const options = parseArgs(process.argv.slice(2));
let inputStream;
let outputStream;
// if stdin was provided
if (stdin && options.file) {
console.error(`You cannot pass in a file ${options.file} and pipe from stdin!`);
process.exit(1);
}
if (stdin) {
inputStream = process.stdin;
} else if (options.file) {
inputStream = fs.createReadStream(path.resolve(options.file));
}
if (!inputStream) {
console.error('A file or stdin must be passed in as an argument or via piping to this script!');
process.exit(1);
}
if (options.output) {
outputStream = fs.createWriteStream(path.resolve(options.output), {
encoding: null
});
} else {
outputStream = process.stdout;
}
return new Promise(function(resolve, reject) {
let allData;
inputStream.on('data', (chunk) => {
allData = concatTypedArrays(allData, chunk);
});
inputStream.on('error', reject);
inputStream.on('close', () => {
if (!allData || !allData.length) {
return reject('file is empty');
}
resolve(allData);
});
}).then(function(inputData) {
const transmuxer = new Transmuxer();
// Setting the BMDT to ensure that captions and id3 tags are not
// time-shifted by this value when they are output and instead are
// zero-based
transmuxer.setBaseMediaDecodeTime(ONE_SECOND_IN_TS);
transmuxer.on('data', function(data) {
if (data.initSegment) {
outputStream.write(concatTypedArrays(data.initSegment, data.data));
} else {
outputStream.write(data.data);
}
});
if (options.debugger) {
// eslint-disable-next-line
debugger;
}
transmuxer.push(inputData);
transmuxer.flush();
process.exit(0);
}).catch(function(e) {
console.error(e);
process.exit(1);
});
};
// no stdin if isTTY is set
cli(!process.stdin.isTTY ? process.stdin : null);

/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*
* A stream-based aac to mp4 converter. This utility can be used to
* deliver mp4s to a SourceBuffer on platforms that support native
* Media Source Extensions.
*/
'use strict';
var Stream = require('../utils/stream.js');
var aacUtils = require('./utils'); // Constants
var _AacStream;
/**
* Splits an incoming stream of binary data into ADTS and ID3 Frames.
*/
_AacStream = function AacStream() {
var everything = new Uint8Array(),
timeStamp = 0;
_AacStream.prototype.init.call(this);
this.setTimestamp = function (timestamp) {
timeStamp = timestamp;
};
this.push = function (bytes) {
var frameSize = 0,
byteIndex = 0,
bytesLeft,
chunk,
packet,
tempLength; // If there are bytes remaining from the last segment, prepend them to the
// bytes that were pushed in
if (everything.length) {
tempLength = everything.length;
// keep a reference to the old buffer before reallocating; copying from
// the newly allocated (zero-filled) array would lose the buffered bytes
var oldEverything = everything;
everything = new Uint8Array(bytes.byteLength + tempLength);
everything.set(oldEverything);
everything.set(bytes, tempLength);
} else {
everything = bytes;
}
while (everything.length - byteIndex >= 3) {
if (everything[byteIndex] === 'I'.charCodeAt(0) && everything[byteIndex + 1] === 'D'.charCodeAt(0) && everything[byteIndex + 2] === '3'.charCodeAt(0)) {
// Exit early because we don't have enough to parse
// the ID3 tag header
if (everything.length - byteIndex < 10) {
break;
} // check framesize
frameSize = aacUtils.parseId3TagSize(everything, byteIndex); // Exit early if we don't have enough in the buffer
// to emit a full packet
// Add to byteIndex to support multiple ID3 tags in sequence
if (byteIndex + frameSize > everything.length) {
break;
}
chunk = {
type: 'timed-metadata',
data: everything.subarray(byteIndex, byteIndex + frameSize)
};
this.trigger('data', chunk);
byteIndex += frameSize;
continue;
} else if ((everything[byteIndex] & 0xff) === 0xff && (everything[byteIndex + 1] & 0xf0) === 0xf0) {
// Exit early because we don't have enough to parse
// the ADTS frame header
if (everything.length - byteIndex < 7) {
break;
}
frameSize = aacUtils.parseAdtsSize(everything, byteIndex); // Exit early if we don't have enough in the buffer
// to emit a full packet
if (byteIndex + frameSize > everything.length) {
break;
}
packet = {
type: 'audio',
data: everything.subarray(byteIndex, byteIndex + frameSize),
pts: timeStamp,
dts: timeStamp
};
this.trigger('data', packet);
byteIndex += frameSize;
continue;
}
byteIndex++;
}
bytesLeft = everything.length - byteIndex;
if (bytesLeft > 0) {
everything = everything.subarray(byteIndex);
} else {
everything = new Uint8Array();
}
};
this.reset = function () {
everything = new Uint8Array();
this.trigger('reset');
};
this.endTimeline = function () {
everything = new Uint8Array();
this.trigger('endedtimeline');
};
};
_AacStream.prototype = new Stream();
module.exports = _AacStream;

/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*
* Utilities to detect basic properties and metadata about Aac data.
*/
'use strict';
var ADTS_SAMPLING_FREQUENCIES = [96000, 88200, 64000, 48000, 44100, 32000, 24000, 22050, 16000, 12000, 11025, 8000, 7350];
var parseId3TagSize = function parseId3TagSize(header, byteIndex) {
var returnSize = header[byteIndex + 6] << 21 | header[byteIndex + 7] << 14 | header[byteIndex + 8] << 7 | header[byteIndex + 9],
flags = header[byteIndex + 5],
footerPresent = (flags & 16) >> 4; // if we get a negative returnSize clamp it to 0
returnSize = returnSize >= 0 ? returnSize : 0;
if (footerPresent) {
return returnSize + 20;
}
return returnSize + 10;
};
var getId3Offset = function getId3Offset(data, offset) {
if (data.length - offset < 10 || data[offset] !== 'I'.charCodeAt(0) || data[offset + 1] !== 'D'.charCodeAt(0) || data[offset + 2] !== '3'.charCodeAt(0)) {
return offset;
}
offset += parseId3TagSize(data, offset);
return getId3Offset(data, offset);
}; // TODO: use vhs-utils
var isLikelyAacData = function isLikelyAacData(data) {
var offset = getId3Offset(data, 0);
return data.length >= offset + 2 && (data[offset] & 0xFF) === 0xFF && (data[offset + 1] & 0xF0) === 0xF0 && // verify that the 2 layer bits are 0, aka this
// is not mp3 data but aac data.
(data[offset + 1] & 0x16) === 0x10;
};
var parseSyncSafeInteger = function parseSyncSafeInteger(data) {
return data[0] << 21 | data[1] << 14 | data[2] << 7 | data[3];
}; // return a percent-encoded representation of the specified byte range
// @see http://en.wikipedia.org/wiki/Percent-encoding
var percentEncode = function percentEncode(bytes, start, end) {
var i,
result = '';
for (i = start; i < end; i++) {
result += '%' + ('00' + bytes[i].toString(16)).slice(-2);
}
return result;
}; // return the string representation of the specified byte range,
// interpreted as ISO-8859-1.
var parseIso88591 = function parseIso88591(bytes, start, end) {
return unescape(percentEncode(bytes, start, end)); // jshint ignore:line
};
var parseAdtsSize = function parseAdtsSize(header, byteIndex) {
var lowThree = (header[byteIndex + 5] & 0xE0) >> 5,
middle = header[byteIndex + 4] << 3,
highTwo = (header[byteIndex + 3] & 0x3) << 11; // parenthesized: `<<` binds tighter than `&`
return highTwo | middle | lowThree;
};
var parseType = function parseType(header, byteIndex) {
if (header[byteIndex] === 'I'.charCodeAt(0) && header[byteIndex + 1] === 'D'.charCodeAt(0) && header[byteIndex + 2] === '3'.charCodeAt(0)) {
return 'timed-metadata';
} else if ((header[byteIndex] & 0xff) === 0xff && (header[byteIndex + 1] & 0xf0) === 0xf0) {
return 'audio';
}
return null;
};
var parseSampleRate = function parseSampleRate(packet) {
var i = 0;
while (i + 5 < packet.length) {
if (packet[i] !== 0xFF || (packet[i + 1] & 0xF6) !== 0xF0) {
// If a valid header was not found, jump one forward and attempt to
// find a valid ADTS header starting at the next byte
i++;
continue;
}
return ADTS_SAMPLING_FREQUENCIES[(packet[i + 2] & 0x3c) >>> 2];
}
return null;
};
var parseAacTimestamp = function parseAacTimestamp(packet) {
var frameStart, frameSize, frame, frameHeader; // find the start of the first frame and the end of the tag
frameStart = 10;
if (packet[5] & 0x40) {
// advance the frame start past the extended header
frameStart += 4; // header size field
frameStart += parseSyncSafeInteger(packet.subarray(10, 14));
} // parse one or more ID3 frames
// http://id3.org/id3v2.3.0#ID3v2_frame_overview
do {
// determine the number of bytes in this frame
frameSize = parseSyncSafeInteger(packet.subarray(frameStart + 4, frameStart + 8));
if (frameSize < 1) {
return null;
}
frameHeader = String.fromCharCode(packet[frameStart], packet[frameStart + 1], packet[frameStart + 2], packet[frameStart + 3]);
if (frameHeader === 'PRIV') {
frame = packet.subarray(frameStart + 10, frameStart + frameSize + 10);
for (var i = 0; i < frame.byteLength; i++) {
if (frame[i] === 0) {
var owner = parseIso88591(frame, 0, i);
if (owner === 'com.apple.streaming.transportStreamTimestamp') {
var d = frame.subarray(i + 1);
var size = (d[3] & 0x01) << 30 | d[4] << 22 | d[5] << 14 | d[6] << 6 | d[7] >>> 2;
size *= 4;
size += d[7] & 0x03;
return size;
}
break;
}
}
}
frameStart += 10; // advance past the frame header
frameStart += frameSize; // advance past the frame body
} while (frameStart < packet.byteLength);
return null;
};
module.exports = {
isLikelyAacData: isLikelyAacData,
parseId3TagSize: parseId3TagSize,
parseAdtsSize: parseAdtsSize,
parseType: parseType,
parseSampleRate: parseSampleRate,
parseAacTimestamp: parseAacTimestamp
};

/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var Stream = require('../utils/stream.js');
var ONE_SECOND_IN_TS = require('../utils/clock').ONE_SECOND_IN_TS;
var _AdtsStream;
var ADTS_SAMPLING_FREQUENCIES = [96000, 88200, 64000, 48000, 44100, 32000, 24000, 22050, 16000, 12000, 11025, 8000, 7350];
/*
* Accepts an ElementaryStream and emits data events with parsed
* AAC Audio Frames of the individual packets. Input audio in ADTS
* format is unpacked and re-emitted as AAC frames.
*
* @see http://wiki.multimedia.cx/index.php?title=ADTS
* @see http://wiki.multimedia.cx/?title=Understanding_AAC
*/
_AdtsStream = function AdtsStream(handlePartialSegments) {
var buffer,
frameNum = 0;
_AdtsStream.prototype.init.call(this);
this.skipWarn_ = function (start, end) {
this.trigger('log', {
level: 'warn',
message: "adts skipping bytes " + start + " to " + end + " in frame " + frameNum + " outside syncword"
});
};
this.push = function (packet) {
var i = 0,
frameLength,
protectionSkipBytes,
frameEnd,
oldBuffer,
sampleCount,
adtsFrameDuration;
if (!handlePartialSegments) {
frameNum = 0;
}
if (packet.type !== 'audio') {
// ignore non-audio data
return;
} // Prepend any data in the buffer to the input data so that we can parse
// aac frames that cross a PES packet boundary
if (buffer && buffer.length) {
oldBuffer = buffer;
buffer = new Uint8Array(oldBuffer.byteLength + packet.data.byteLength);
buffer.set(oldBuffer);
buffer.set(packet.data, oldBuffer.byteLength);
} else {
buffer = packet.data;
} // unpack any ADTS frames which have been fully received
// for details on the ADTS header, see http://wiki.multimedia.cx/index.php?title=ADTS
var skip; // We use i + 7 here because we want to be able to parse the entire header.
// If we don't have enough bytes to do that, then we definitely won't have a full frame.
while (i + 7 < buffer.length) {
// Look for the start of an ADTS header..
if (buffer[i] !== 0xFF || (buffer[i + 1] & 0xF6) !== 0xF0) {
if (typeof skip !== 'number') {
skip = i;
} // If a valid header was not found, jump one forward and attempt to
// find a valid ADTS header starting at the next byte
i++;
continue;
}
if (typeof skip === 'number') {
this.skipWarn_(skip, i);
skip = null;
} // The protection skip bit tells us if we have 2 bytes of CRC data at the
// end of the ADTS header
protectionSkipBytes = (~buffer[i + 1] & 0x01) * 2; // Frame length is a 13 bit integer starting 16 bits from the
// end of the sync sequence
// NOTE: frame length includes the size of the header
frameLength = (buffer[i + 3] & 0x03) << 11 | buffer[i + 4] << 3 | (buffer[i + 5] & 0xe0) >> 5;
sampleCount = ((buffer[i + 6] & 0x03) + 1) * 1024;
adtsFrameDuration = sampleCount * ONE_SECOND_IN_TS / ADTS_SAMPLING_FREQUENCIES[(buffer[i + 2] & 0x3c) >>> 2]; // If we don't have enough data to actually finish this ADTS frame,
// then we have to wait for more data
if (buffer.byteLength - i < frameLength) {
break;
} // Otherwise, deliver the complete AAC frame
this.trigger('data', {
pts: packet.pts + frameNum * adtsFrameDuration,
dts: packet.dts + frameNum * adtsFrameDuration,
sampleCount: sampleCount,
audioobjecttype: (buffer[i + 2] >>> 6 & 0x03) + 1,
channelcount: (buffer[i + 2] & 1) << 2 | (buffer[i + 3] & 0xc0) >>> 6,
samplerate: ADTS_SAMPLING_FREQUENCIES[(buffer[i + 2] & 0x3c) >>> 2],
samplingfrequencyindex: (buffer[i + 2] & 0x3c) >>> 2,
// assume ISO/IEC 14496-12 AudioSampleEntry default of 16
samplesize: 16,
// data is the frame without its header
data: buffer.subarray(i + 7 + protectionSkipBytes, i + frameLength)
});
frameNum++;
i += frameLength;
}
if (typeof skip === 'number') {
this.skipWarn_(skip, i);
skip = null;
} // remove processed bytes from the buffer.
buffer = buffer.subarray(i);
};
this.flush = function () {
frameNum = 0;
this.trigger('done');
};
this.reset = function () {
buffer = void 0;
this.trigger('reset');
};
this.endTimeline = function () {
buffer = void 0;
this.trigger('endedtimeline');
};
};
_AdtsStream.prototype = new Stream();
module.exports = _AdtsStream;
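The header bit twiddling in `push` above is easy to exercise in isolation. A minimal sketch, using a hand-built (hypothetical) 7-byte ADTS header for AAC LC at 44.1 kHz stereo with a 107-byte frame, applying the same bit extractions as the stream:

```javascript
// sampling frequency table, as in AdtsStream above
var ADTS_SAMPLING_FREQUENCIES = [96000, 88200, 64000, 48000, 44100, 32000, 24000, 22050, 16000, 12000, 11025, 8000, 7350];

// hand-built header: syncword 0xFFF, protection_absent = 1, AAC LC,
// sampling index 4 (44100), 2 channels, frame length 107, one raw block
var header = new Uint8Array([0xFF, 0xF1, 0x50, 0x80, 0x0D, 0x60, 0x00]);

var parseAdtsHeader = function (b) {
  return {
    syncOk: b[0] === 0xFF && (b[1] & 0xF6) === 0xF0,
    protectionSkipBytes: (~b[1] & 0x01) * 2,
    frameLength: (b[3] & 0x03) << 11 | b[4] << 3 | (b[5] & 0xe0) >> 5,
    sampleCount: ((b[6] & 0x03) + 1) * 1024,
    audioobjecttype: (b[2] >>> 6 & 0x03) + 1,
    channelcount: (b[2] & 1) << 2 | (b[3] & 0xc0) >>> 6,
    samplerate: ADTS_SAMPLING_FREQUENCIES[(b[2] & 0x3c) >>> 2]
  };
};

var parsed = parseAdtsHeader(header);
// parsed.frameLength === 107, parsed.samplerate === 44100,
// parsed.channelcount === 2, parsed.audioobjecttype === 2 (AAC LC)
```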


@@ -0,0 +1,571 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var Stream = require('../utils/stream.js');
var ExpGolomb = require('../utils/exp-golomb.js');
var _H264Stream, _NalByteStream;
var PROFILES_WITH_OPTIONAL_SPS_DATA;
/**
* Accepts a NAL unit byte stream and unpacks the embedded NAL units.
*/
_NalByteStream = function NalByteStream() {
var syncPoint = 0,
i,
buffer;
_NalByteStream.prototype.init.call(this);
/*
* Scans a byte stream and triggers a data event with the NAL units found.
* @param {Object} data Event received from H264Stream
* @param {Uint8Array} data.data The h264 byte stream to be scanned
*
* @see H264Stream.push
*/
this.push = function (data) {
var swapBuffer;
if (!buffer) {
buffer = data.data;
} else {
swapBuffer = new Uint8Array(buffer.byteLength + data.data.byteLength);
swapBuffer.set(buffer);
swapBuffer.set(data.data, buffer.byteLength);
buffer = swapBuffer;
}
var len = buffer.byteLength; // Rec. ITU-T H.264, Annex B
// scan for NAL unit boundaries
// a match looks like this:
// 0 0 1 .. NAL .. 0 0 1
// ^ sync point ^ i
// or this:
// 0 0 1 .. NAL .. 0 0 0
// ^ sync point ^ i
// advance the sync point to a NAL start, if necessary
for (; syncPoint < len - 3; syncPoint++) {
if (buffer[syncPoint + 2] === 1) {
// the sync point is properly aligned
i = syncPoint + 5;
break;
}
}
while (i < len) {
// look at the current byte to determine if we've hit the end of
// a NAL unit boundary
switch (buffer[i]) {
case 0:
// skip past non-sync sequences
if (buffer[i - 1] !== 0) {
i += 2;
break;
} else if (buffer[i - 2] !== 0) {
i++;
break;
} // deliver the NAL unit if it isn't empty
if (syncPoint + 3 !== i - 2) {
this.trigger('data', buffer.subarray(syncPoint + 3, i - 2));
} // drop trailing zeroes
do {
i++;
} while (buffer[i] !== 1 && i < len);
syncPoint = i - 2;
i += 3;
break;
case 1:
// skip past non-sync sequences
if (buffer[i - 1] !== 0 || buffer[i - 2] !== 0) {
i += 3;
break;
} // deliver the NAL unit
this.trigger('data', buffer.subarray(syncPoint + 3, i - 2));
syncPoint = i - 2;
i += 3;
break;
default:
// the current byte isn't a one or zero, so it cannot be part
// of a sync sequence
i += 3;
break;
}
} // filter out the NAL units that were delivered
buffer = buffer.subarray(syncPoint);
i -= syncPoint;
syncPoint = 0;
};
this.reset = function () {
buffer = null;
syncPoint = 0;
this.trigger('reset');
};
this.flush = function () {
// deliver the last buffered NAL unit
if (buffer && buffer.byteLength > 3) {
this.trigger('data', buffer.subarray(syncPoint + 3));
} // reset the stream state
buffer = null;
syncPoint = 0;
this.trigger('done');
};
this.endTimeline = function () {
this.flush();
this.trigger('endedtimeline');
};
};
_NalByteStream.prototype = new Stream(); // values of profile_idc that indicate additional fields are included in the SPS
// see Recommendation ITU-T H.264 (4/2013),
// 7.3.2.1.1 Sequence parameter set data syntax
PROFILES_WITH_OPTIONAL_SPS_DATA = {
100: true,
110: true,
122: true,
244: true,
44: true,
83: true,
86: true,
118: true,
128: true,
// TODO: the three profiles below don't
// appear to have sps data in the specification anymore?
138: true,
139: true,
134: true
};
/**
* Accepts input from an ElementaryStream and produces H.264 NAL unit data
* events.
*/
_H264Stream = function H264Stream() {
var nalByteStream = new _NalByteStream(),
self,
trackId,
currentPts,
currentDts,
discardEmulationPreventionBytes,
readSequenceParameterSet,
skipScalingList;
_H264Stream.prototype.init.call(this);
self = this;
/*
* Pushes a packet from a stream onto the NalByteStream
*
* @param {Object} packet - A packet received from a stream
* @param {Uint8Array} packet.data - The raw bytes of the packet
* @param {Number} packet.dts - Decode timestamp of the packet
* @param {Number} packet.pts - Presentation timestamp of the packet
* @param {Number} packet.trackId - The id of the h264 track this packet came from
* @param {('video'|'audio')} packet.type - The type of packet
*
*/
this.push = function (packet) {
if (packet.type !== 'video') {
return;
}
trackId = packet.trackId;
currentPts = packet.pts;
currentDts = packet.dts;
nalByteStream.push(packet);
};
/*
* Identify NAL unit types and pass on the NALU, trackId, presentation and decode timestamps
* for the NALUs to the next stream component.
* Also, preprocess caption and sequence parameter NALUs.
*
* @param {Uint8Array} data - A NAL unit identified by `NalByteStream.push`
* @see NalByteStream.push
*/
nalByteStream.on('data', function (data) {
var event = {
trackId: trackId,
pts: currentPts,
dts: currentDts,
data: data,
nalUnitTypeCode: data[0] & 0x1f
};
switch (event.nalUnitTypeCode) {
case 0x05:
event.nalUnitType = 'slice_layer_without_partitioning_rbsp_idr';
break;
case 0x06:
event.nalUnitType = 'sei_rbsp';
event.escapedRBSP = discardEmulationPreventionBytes(data.subarray(1));
break;
case 0x07:
event.nalUnitType = 'seq_parameter_set_rbsp';
event.escapedRBSP = discardEmulationPreventionBytes(data.subarray(1));
event.config = readSequenceParameterSet(event.escapedRBSP);
break;
case 0x08:
event.nalUnitType = 'pic_parameter_set_rbsp';
break;
case 0x09:
event.nalUnitType = 'access_unit_delimiter_rbsp';
break;
default:
break;
} // This triggers data on the H264Stream
self.trigger('data', event);
});
nalByteStream.on('done', function () {
self.trigger('done');
});
nalByteStream.on('partialdone', function () {
self.trigger('partialdone');
});
nalByteStream.on('reset', function () {
self.trigger('reset');
});
nalByteStream.on('endedtimeline', function () {
self.trigger('endedtimeline');
});
this.flush = function () {
nalByteStream.flush();
};
this.partialFlush = function () {
nalByteStream.partialFlush();
};
this.reset = function () {
nalByteStream.reset();
};
this.endTimeline = function () {
nalByteStream.endTimeline();
};
/**
* Advance the ExpGolomb decoder past a scaling list. The scaling
* list is optionally transmitted as part of a sequence parameter
* set and is not relevant to transmuxing.
* @param count {number} the number of entries in this scaling list
* @param expGolombDecoder {object} an ExpGolomb pointed to the
* start of a scaling list
* @see Recommendation ITU-T H.264, Section 7.3.2.1.1.1
*/
skipScalingList = function skipScalingList(count, expGolombDecoder) {
var lastScale = 8,
nextScale = 8,
j,
deltaScale;
for (j = 0; j < count; j++) {
if (nextScale !== 0) {
deltaScale = expGolombDecoder.readExpGolomb();
nextScale = (lastScale + deltaScale + 256) % 256;
}
lastScale = nextScale === 0 ? lastScale : nextScale;
}
};
/**
* Expunge any "Emulation Prevention" bytes from a "Raw Byte
* Sequence Payload"
* @param data {Uint8Array} the bytes of a RBSP from a NAL
* unit
* @return {Uint8Array} the RBSP without any Emulation
* Prevention Bytes
*/
discardEmulationPreventionBytes = function discardEmulationPreventionBytes(data) {
var length = data.byteLength,
emulationPreventionBytesPositions = [],
i = 1,
newLength,
newData; // Find all `Emulation Prevention Bytes`
while (i < length - 2) {
if (data[i] === 0 && data[i + 1] === 0 && data[i + 2] === 0x03) {
emulationPreventionBytesPositions.push(i + 2);
i += 2;
} else {
i++;
}
} // If no Emulation Prevention Bytes were found just return the original
// array
if (emulationPreventionBytesPositions.length === 0) {
return data;
} // Create a new array to hold the NAL unit data
newLength = length - emulationPreventionBytesPositions.length;
newData = new Uint8Array(newLength);
var sourceIndex = 0;
for (i = 0; i < newLength; sourceIndex++, i++) {
if (sourceIndex === emulationPreventionBytesPositions[0]) {
// Skip this byte
sourceIndex++; // Remove this position index
emulationPreventionBytesPositions.shift();
}
newData[i] = data[sourceIndex];
}
return newData;
};
/**
* Read a sequence parameter set and return some interesting video
* properties. A sequence parameter set is the H264 metadata that
* describes the properties of upcoming video frames.
* @param data {Uint8Array} the bytes of a sequence parameter set
* @return {object} an object with configuration parsed from the
* sequence parameter set, including the dimensions of the
* associated video frames.
*/
readSequenceParameterSet = function readSequenceParameterSet(data) {
var frameCropLeftOffset = 0,
frameCropRightOffset = 0,
frameCropTopOffset = 0,
frameCropBottomOffset = 0,
sarScale = 1,
expGolombDecoder,
profileIdc,
levelIdc,
profileCompatibility,
chromaFormatIdc,
picOrderCntType,
numRefFramesInPicOrderCntCycle,
picWidthInMbsMinus1,
picHeightInMapUnitsMinus1,
frameMbsOnlyFlag,
scalingListCount,
sarRatio = [1, 1],
aspectRatioIdc,
i;
expGolombDecoder = new ExpGolomb(data);
profileIdc = expGolombDecoder.readUnsignedByte(); // profile_idc
profileCompatibility = expGolombDecoder.readUnsignedByte(); // constraint_set[0-5]_flag
levelIdc = expGolombDecoder.readUnsignedByte(); // level_idc u(8)
expGolombDecoder.skipUnsignedExpGolomb(); // seq_parameter_set_id
// some profiles have more optional data we don't need
if (PROFILES_WITH_OPTIONAL_SPS_DATA[profileIdc]) {
chromaFormatIdc = expGolombDecoder.readUnsignedExpGolomb();
if (chromaFormatIdc === 3) {
expGolombDecoder.skipBits(1); // separate_colour_plane_flag
}
expGolombDecoder.skipUnsignedExpGolomb(); // bit_depth_luma_minus8
expGolombDecoder.skipUnsignedExpGolomb(); // bit_depth_chroma_minus8
expGolombDecoder.skipBits(1); // qpprime_y_zero_transform_bypass_flag
if (expGolombDecoder.readBoolean()) {
// seq_scaling_matrix_present_flag
scalingListCount = chromaFormatIdc !== 3 ? 8 : 12;
for (i = 0; i < scalingListCount; i++) {
if (expGolombDecoder.readBoolean()) {
// seq_scaling_list_present_flag[ i ]
if (i < 6) {
skipScalingList(16, expGolombDecoder);
} else {
skipScalingList(64, expGolombDecoder);
}
}
}
}
}
expGolombDecoder.skipUnsignedExpGolomb(); // log2_max_frame_num_minus4
picOrderCntType = expGolombDecoder.readUnsignedExpGolomb();
if (picOrderCntType === 0) {
expGolombDecoder.readUnsignedExpGolomb(); // log2_max_pic_order_cnt_lsb_minus4
} else if (picOrderCntType === 1) {
expGolombDecoder.skipBits(1); // delta_pic_order_always_zero_flag
expGolombDecoder.skipExpGolomb(); // offset_for_non_ref_pic
expGolombDecoder.skipExpGolomb(); // offset_for_top_to_bottom_field
numRefFramesInPicOrderCntCycle = expGolombDecoder.readUnsignedExpGolomb();
for (i = 0; i < numRefFramesInPicOrderCntCycle; i++) {
expGolombDecoder.skipExpGolomb(); // offset_for_ref_frame[ i ]
}
}
expGolombDecoder.skipUnsignedExpGolomb(); // max_num_ref_frames
expGolombDecoder.skipBits(1); // gaps_in_frame_num_value_allowed_flag
picWidthInMbsMinus1 = expGolombDecoder.readUnsignedExpGolomb();
picHeightInMapUnitsMinus1 = expGolombDecoder.readUnsignedExpGolomb();
frameMbsOnlyFlag = expGolombDecoder.readBits(1);
if (frameMbsOnlyFlag === 0) {
expGolombDecoder.skipBits(1); // mb_adaptive_frame_field_flag
}
expGolombDecoder.skipBits(1); // direct_8x8_inference_flag
if (expGolombDecoder.readBoolean()) {
// frame_cropping_flag
frameCropLeftOffset = expGolombDecoder.readUnsignedExpGolomb();
frameCropRightOffset = expGolombDecoder.readUnsignedExpGolomb();
frameCropTopOffset = expGolombDecoder.readUnsignedExpGolomb();
frameCropBottomOffset = expGolombDecoder.readUnsignedExpGolomb();
}
if (expGolombDecoder.readBoolean()) {
// vui_parameters_present_flag
if (expGolombDecoder.readBoolean()) {
// aspect_ratio_info_present_flag
aspectRatioIdc = expGolombDecoder.readUnsignedByte();
switch (aspectRatioIdc) {
case 1:
sarRatio = [1, 1];
break;
case 2:
sarRatio = [12, 11];
break;
case 3:
sarRatio = [10, 11];
break;
case 4:
sarRatio = [16, 11];
break;
case 5:
sarRatio = [40, 33];
break;
case 6:
sarRatio = [24, 11];
break;
case 7:
sarRatio = [20, 11];
break;
case 8:
sarRatio = [32, 11];
break;
case 9:
sarRatio = [80, 33];
break;
case 10:
sarRatio = [18, 11];
break;
case 11:
sarRatio = [15, 11];
break;
case 12:
sarRatio = [64, 33];
break;
case 13:
sarRatio = [160, 99];
break;
case 14:
sarRatio = [4, 3];
break;
case 15:
sarRatio = [3, 2];
break;
case 16:
sarRatio = [2, 1];
break;
case 255:
{
sarRatio = [expGolombDecoder.readUnsignedByte() << 8 | expGolombDecoder.readUnsignedByte(), expGolombDecoder.readUnsignedByte() << 8 | expGolombDecoder.readUnsignedByte()];
break;
}
}
if (sarRatio) {
sarScale = sarRatio[0] / sarRatio[1];
}
}
}
return {
profileIdc: profileIdc,
levelIdc: levelIdc,
profileCompatibility: profileCompatibility,
width: (picWidthInMbsMinus1 + 1) * 16 - frameCropLeftOffset * 2 - frameCropRightOffset * 2,
height: (2 - frameMbsOnlyFlag) * (picHeightInMapUnitsMinus1 + 1) * 16 - frameCropTopOffset * 2 - frameCropBottomOffset * 2,
// sar is sample aspect ratio
sarRatio: sarRatio
};
};
};
_H264Stream.prototype = new Stream();
module.exports = {
H264Stream: _H264Stream,
NalByteStream: _NalByteStream
};
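The `discardEmulationPreventionBytes` helper above is self-contained: H.264 escapes any `00 00` pair inside an RBSP by inserting a `0x03` byte so the payload can never collide with a start code, and decoding strips those bytes back out. A standalone sketch of that stripping, run on a made-up escaped payload:

```javascript
// standalone copy of the stripping logic from H264Stream above
var discardEmulationPreventionBytes = function (data) {
  var length = data.byteLength,
    positions = [],
    i = 1;
  while (i < length - 2) {
    if (data[i] === 0 && data[i + 1] === 0 && data[i + 2] === 0x03) {
      positions.push(i + 2);
      i += 2;
    } else {
      i++;
    }
  }
  if (positions.length === 0) {
    return data;
  }
  var newData = new Uint8Array(length - positions.length);
  var sourceIndex = 0;
  for (i = 0; i < newData.length; sourceIndex++, i++) {
    if (sourceIndex === positions[0]) {
      sourceIndex++; // skip the emulation prevention byte
      positions.shift();
    }
    newData[i] = data[sourceIndex];
  }
  return newData;
};

// "00 00 03" in the escaped payload becomes "00 00" after stripping
var escaped = new Uint8Array([0x67, 0x00, 0x00, 0x03, 0x01, 0x42]);
var rbsp = discardEmulationPreventionBytes(escaped);
// rbsp is [0x67, 0x00, 0x00, 0x01, 0x42]
```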


@@ -0,0 +1,12 @@
"use strict";
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
module.exports = {
Adts: require('./adts'),
h264: require('./h264')
};


@@ -0,0 +1,5 @@
"use strict";
// constants
var AUDIO_PROPERTIES = ['audioobjecttype', 'channelcount', 'samplerate', 'samplingfrequencyindex', 'samplesize'];
module.exports = AUDIO_PROPERTIES;


@@ -0,0 +1,4 @@
"use strict";
var VIDEO_PROPERTIES = ['width', 'height', 'profileIdc', 'levelIdc', 'profileCompatibility', 'sarRatio'];
module.exports = VIDEO_PROPERTIES;


@@ -0,0 +1,53 @@
"use strict";
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
var highPrefix = [33, 16, 5, 32, 164, 27];
var lowPrefix = [33, 65, 108, 84, 1, 2, 4, 8, 168, 2, 4, 8, 17, 191, 252];
var zeroFill = function zeroFill(count) {
var a = [];
while (count--) {
a.push(0);
}
return a;
};
var makeTable = function makeTable(metaTable) {
return Object.keys(metaTable).reduce(function (obj, key) {
obj[key] = new Uint8Array(metaTable[key].reduce(function (arr, part) {
return arr.concat(part);
}, []));
return obj;
}, {});
};
var silence;
module.exports = function () {
if (!silence) {
// Frames-of-silence to use for filling in missing AAC frames
var coneOfSilence = {
96000: [highPrefix, [227, 64], zeroFill(154), [56]],
88200: [highPrefix, [231], zeroFill(170), [56]],
64000: [highPrefix, [248, 192], zeroFill(240), [56]],
48000: [highPrefix, [255, 192], zeroFill(268), [55, 148, 128], zeroFill(54), [112]],
44100: [highPrefix, [255, 192], zeroFill(268), [55, 163, 128], zeroFill(84), [112]],
32000: [highPrefix, [255, 192], zeroFill(268), [55, 234], zeroFill(226), [112]],
24000: [highPrefix, [255, 192], zeroFill(268), [55, 255, 128], zeroFill(268), [111, 112], zeroFill(126), [224]],
16000: [highPrefix, [255, 192], zeroFill(268), [55, 255, 128], zeroFill(268), [111, 255], zeroFill(269), [223, 108], zeroFill(195), [1, 192]],
12000: [lowPrefix, zeroFill(268), [3, 127, 248], zeroFill(268), [6, 255, 240], zeroFill(268), [13, 255, 224], zeroFill(268), [27, 253, 128], zeroFill(259), [56]],
11025: [lowPrefix, zeroFill(268), [3, 127, 248], zeroFill(268), [6, 255, 240], zeroFill(268), [13, 255, 224], zeroFill(268), [27, 255, 192], zeroFill(268), [55, 175, 128], zeroFill(108), [112]],
8000: [lowPrefix, zeroFill(268), [3, 121, 16], zeroFill(47), [7]]
};
silence = makeTable(coneOfSilence);
}
return silence;
};
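`makeTable` above just flattens each entry's list of parts (prefix arrays and `zeroFill` runs) into a single `Uint8Array` per sampling rate. A tiny sketch with a made-up single-entry table:

```javascript
var zeroFill = function (count) {
  var a = [];
  while (count--) {
    a.push(0);
  }
  return a;
};

// flatten each entry's parts into one Uint8Array, as in the module above
var makeTable = function (metaTable) {
  return Object.keys(metaTable).reduce(function (obj, key) {
    obj[key] = new Uint8Array(metaTable[key].reduce(function (arr, part) {
      return arr.concat(part);
    }, []));
    return obj;
  }, {});
};

// a made-up table: one entry built from a prefix, a zero run, and a suffix
var table = makeTable({ 8000: [[1, 2], zeroFill(3), [9]] });
// table[8000] is Uint8Array [1, 2, 0, 0, 0, 9]
```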


@@ -0,0 +1,147 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var Stream = require('../utils/stream.js');
/**
* The final stage of the transmuxer that emits the flv tags
* for audio, video, and metadata. Also translates in time and
* outputs caption data and id3 cues.
*/
var CoalesceStream = function CoalesceStream(options) {
// Number of Tracks per output segment
// If greater than 1, we combine multiple
// tracks into a single segment
this.numberOfTracks = 0;
this.metadataStream = options.metadataStream;
this.videoTags = [];
this.audioTags = [];
this.videoTrack = null;
this.audioTrack = null;
this.pendingCaptions = [];
this.pendingMetadata = [];
this.pendingTracks = 0;
this.processedTracks = 0;
CoalesceStream.prototype.init.call(this); // Take output from multiple
this.push = function (output) {
// buffer incoming captions until the associated video segment
// finishes
if (output.content || output.text) {
return this.pendingCaptions.push(output);
} // buffer incoming id3 tags until the final flush
if (output.frames) {
return this.pendingMetadata.push(output);
}
if (output.track.type === 'video') {
this.videoTrack = output.track;
this.videoTags = output.tags;
this.pendingTracks++;
}
if (output.track.type === 'audio') {
this.audioTrack = output.track;
this.audioTags = output.tags;
this.pendingTracks++;
}
};
};
CoalesceStream.prototype = new Stream();
CoalesceStream.prototype.flush = function (flushSource) {
var id3,
caption,
i,
timelineStartPts,
event = {
tags: {},
captions: [],
captionStreams: {},
metadata: []
};
if (this.pendingTracks < this.numberOfTracks) {
if (flushSource !== 'VideoSegmentStream' && flushSource !== 'AudioSegmentStream') {
// Return because we haven't received a flush from a data-generating
// portion of the segment (meaning that we have only received meta-data
// or captions.)
return;
} else if (this.pendingTracks === 0) {
// In the case where we receive a flush without any data having been
// received we consider it an emitted track for the purposes of coalescing
// `done` events.
// We do this for the case where there is an audio and video track in the
// segment but no audio data. (seen in several playlists with alternate
// audio tracks and no audio present in the main TS segments.)
this.processedTracks++;
if (this.processedTracks < this.numberOfTracks) {
return;
}
}
}
this.processedTracks += this.pendingTracks;
this.pendingTracks = 0;
if (this.processedTracks < this.numberOfTracks) {
return;
}
if (this.videoTrack) {
timelineStartPts = this.videoTrack.timelineStartInfo.pts;
} else if (this.audioTrack) {
timelineStartPts = this.audioTrack.timelineStartInfo.pts;
}
event.tags.videoTags = this.videoTags;
event.tags.audioTags = this.audioTags; // Translate caption PTS times into second offsets into the
// video timeline for the segment, and add track info
for (i = 0; i < this.pendingCaptions.length; i++) {
caption = this.pendingCaptions[i];
caption.startTime = caption.startPts - timelineStartPts;
caption.startTime /= 90e3;
caption.endTime = caption.endPts - timelineStartPts;
caption.endTime /= 90e3;
event.captionStreams[caption.stream] = true;
event.captions.push(caption);
} // Translate ID3 frame PTS times into second offsets into the
// video timeline for the segment
for (i = 0; i < this.pendingMetadata.length; i++) {
id3 = this.pendingMetadata[i];
id3.cueTime = id3.pts - timelineStartPts;
id3.cueTime /= 90e3;
event.metadata.push(id3);
} // We add this to every single emitted segment even though we only need
// it for the first
event.metadata.dispatchType = this.metadataStream.dispatchType; // Reset stream state
this.videoTrack = null;
this.audioTrack = null;
this.videoTags = [];
this.audioTags = [];
this.pendingCaptions.length = 0;
this.pendingMetadata.length = 0;
this.pendingTracks = 0;
this.processedTracks = 0; // Emit the final segment
this.trigger('data', event);
this.trigger('done');
};
module.exports = CoalesceStream;
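The caption and ID3 time conversion in `flush` above is an offset from the segment's timeline start followed by a divide by the 90 kHz MPEG-TS clock. A sketch with made-up PTS values:

```javascript
var ONE_SECOND_IN_TS = 90e3; // the MPEG-TS clock runs at 90 kHz

// hypothetical caption timed against a segment whose timeline
// starts at PTS 900000 (10 seconds on the TS clock)
var timelineStartPts = 900000;
var caption = { startPts: 990000, endPts: 1080000 };

// same arithmetic as CoalesceStream.prototype.flush
caption.startTime = (caption.startPts - timelineStartPts) / ONE_SECOND_IN_TS; // 1 second in
caption.endTime = (caption.endPts - timelineStartPts) / ONE_SECOND_IN_TS; // 2 seconds in
```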


@@ -0,0 +1,62 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var FlvTag = require('./flv-tag.js'); // For information on the FLV format, see
// http://download.macromedia.com/f4v/video_file_format_spec_v10_1.pdf.
// Technically, this function returns the header and a metadata FLV tag
// if duration is greater than zero
// duration in seconds
// @return {object} the bytes of the FLV header as a Uint8Array
var getFlvHeader = function getFlvHeader(duration, audio, video) {
// :ByteArray {
var headBytes = new Uint8Array(3 + 1 + 1 + 4),
head = new DataView(headBytes.buffer),
metadata,
result,
metadataLength; // default arguments
duration = duration || 0;
audio = audio === undefined ? true : audio;
video = video === undefined ? true : video; // signature
head.setUint8(0, 0x46); // 'F'
head.setUint8(1, 0x4c); // 'L'
head.setUint8(2, 0x56); // 'V'
// version
head.setUint8(3, 0x01); // flags
head.setUint8(4, (audio ? 0x04 : 0x00) | (video ? 0x01 : 0x00)); // data offset, should be 9 for FLV v1
head.setUint32(5, headBytes.byteLength); // init the first FLV tag
if (duration <= 0) {
// no duration available so just write the first field of the first
// FLV tag
result = new Uint8Array(headBytes.byteLength + 4);
result.set(headBytes);
result.set([0, 0, 0, 0], headBytes.byteLength);
return result;
} // write out the duration metadata tag
metadata = new FlvTag(FlvTag.METADATA_TAG);
metadata.pts = metadata.dts = 0;
metadata.writeMetaDataDouble('duration', duration);
var metadataBytes = metadata.finalize();
metadataLength = metadataBytes.length;
result = new Uint8Array(headBytes.byteLength + metadataLength);
result.set(headBytes); // copy the metadata tag bytes in after the header
result.set(metadataBytes, headBytes.byteLength);
return result;
};
module.exports = getFlvHeader;
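As a sanity check on the layout `getFlvHeader` produces, the first nine bytes of an audio+video header should be the `FLV` signature, version 1, flags `0x05`, and a big-endian data offset of 9. A sketch that builds just those bytes the same way:

```javascript
var audio = true,
  video = true;
var headBytes = new Uint8Array(9);
var head = new DataView(headBytes.buffer);
head.setUint8(0, 0x46); // 'F'
head.setUint8(1, 0x4c); // 'L'
head.setUint8(2, 0x56); // 'V'
head.setUint8(3, 0x01); // version 1
head.setUint8(4, (audio ? 0x04 : 0x00) | (video ? 0x01 : 0x00)); // flags
head.setUint32(5, headBytes.byteLength); // data offset, 9 for FLV v1

var signature = String.fromCharCode(headBytes[0], headBytes[1], headBytes[2]);
// signature === 'FLV', headBytes[4] === 0x05, head.getUint32(5) === 9
```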


@@ -0,0 +1,372 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*
* An object that stores the bytes of an FLV tag and methods for
* querying and manipulating that data.
* @see http://download.macromedia.com/f4v/video_file_format_spec_v10_1.pdf
*/
'use strict';
var _FlvTag; // (type:uint, extraData:Boolean = false) extends ByteArray
_FlvTag = function FlvTag(type, extraData) {
var // Counter if this is a metadata tag, nal start marker if this is a video
// tag. unused if this is an audio tag
adHoc = 0,
// :uint
// The default size is 16kb but this is not enough to hold iframe
// data and the resizing algorithm costs a bit so we create a larger
// starting buffer for video tags
bufferStartSize = 16384,
// checks whether the FLV tag has enough capacity to accept the proposed
// write and re-allocates the internal buffers if necessary
prepareWrite = function prepareWrite(flv, count) {
var bytes,
minLength = flv.position + count;
if (minLength < flv.bytes.byteLength) {
// there's enough capacity so do nothing
return;
} // allocate a new buffer and copy over the data that will not be modified
bytes = new Uint8Array(minLength * 2);
bytes.set(flv.bytes.subarray(0, flv.position), 0);
flv.bytes = bytes;
flv.view = new DataView(flv.bytes.buffer);
},
// commonly used metadata properties
widthBytes = _FlvTag.widthBytes || new Uint8Array('width'.length),
heightBytes = _FlvTag.heightBytes || new Uint8Array('height'.length),
videocodecidBytes = _FlvTag.videocodecidBytes || new Uint8Array('videocodecid'.length),
i;
if (!_FlvTag.widthBytes) {
// calculating the bytes of common metadata names ahead of time makes the
// corresponding writes faster because we don't have to loop over the
// characters
// re-test with test/perf.html if you're planning on changing this
for (i = 0; i < 'width'.length; i++) {
widthBytes[i] = 'width'.charCodeAt(i);
}
for (i = 0; i < 'height'.length; i++) {
heightBytes[i] = 'height'.charCodeAt(i);
}
for (i = 0; i < 'videocodecid'.length; i++) {
videocodecidBytes[i] = 'videocodecid'.charCodeAt(i);
}
_FlvTag.widthBytes = widthBytes;
_FlvTag.heightBytes = heightBytes;
_FlvTag.videocodecidBytes = videocodecidBytes;
}
this.keyFrame = false; // :Boolean
switch (type) {
case _FlvTag.VIDEO_TAG:
this.length = 16; // video tags start with a larger buffer (6 * 16kb = 96kb)
bufferStartSize *= 6;
break;
case _FlvTag.AUDIO_TAG:
this.length = 13;
this.keyFrame = true;
break;
case _FlvTag.METADATA_TAG:
this.length = 29;
this.keyFrame = true;
break;
default:
throw new Error('Unknown FLV tag type');
}
this.bytes = new Uint8Array(bufferStartSize);
this.view = new DataView(this.bytes.buffer);
this.bytes[0] = type;
this.position = this.length;
this.keyFrame = extraData; // Defaults to false
// presentation timestamp
this.pts = 0; // decoder timestamp
this.dts = 0; // ByteArray#writeBytes(bytes:ByteArray, offset:uint = 0, length:uint = 0)
this.writeBytes = function (bytes, offset, length) {
var start = offset || 0,
end;
length = length || bytes.byteLength;
end = start + length;
prepareWrite(this, length);
this.bytes.set(bytes.subarray(start, end), this.position);
this.position += length;
this.length = Math.max(this.length, this.position);
}; // ByteArray#writeByte(value:int):void
this.writeByte = function (byte) {
prepareWrite(this, 1);
this.bytes[this.position] = byte;
this.position++;
this.length = Math.max(this.length, this.position);
}; // ByteArray#writeShort(value:int):void
this.writeShort = function (short) {
prepareWrite(this, 2);
this.view.setUint16(this.position, short);
this.position += 2;
this.length = Math.max(this.length, this.position);
}; // Negative index into array
// (pos:uint):int
this.negIndex = function (pos) {
return this.bytes[this.length - pos];
}; // The functions below ONLY work when this[0] == VIDEO_TAG.
// We are not going to check for that because we don't want the overhead
// (nal:ByteArray = null):int
this.nalUnitSize = function () {
if (adHoc === 0) {
return 0;
}
return this.length - (adHoc + 4);
};
this.startNalUnit = function () {
// remember position and add 4 bytes
if (adHoc > 0) {
throw new Error('Attempted to create new NAL without closing the old one');
} // reserve 4 bytes for nal unit size
adHoc = this.length;
this.length += 4;
this.position = this.length;
}; // (nal:ByteArray = null):void
this.endNalUnit = function (nalContainer) {
var nalStart, // :uint
nalLength; // :uint
// Rewind to the marker and write the size
if (this.length === adHoc + 4) {
// we started a nal unit, but didn't write one, so roll back the 4 byte size value
this.length -= 4;
} else if (adHoc > 0) {
nalStart = adHoc + 4;
nalLength = this.length - nalStart;
this.position = adHoc;
this.view.setUint32(this.position, nalLength);
this.position = this.length;
if (nalContainer) {
// Add the tag to the NAL unit
nalContainer.push(this.bytes.subarray(nalStart, nalStart + nalLength));
}
}
adHoc = 0;
};
/**
* Write out a 64-bit floating point valued metadata property. This method is
* called frequently during a typical parse and needs to be fast.
*/
// (key:String, val:Number):void
this.writeMetaDataDouble = function (key, val) {
var i;
prepareWrite(this, 2 + key.length + 9); // write size of property name
this.view.setUint16(this.position, key.length);
this.position += 2; // this next part looks terrible but it improves parser throughput by
// 10kB/s in my testing
// write property name
if (key === 'width') {
this.bytes.set(widthBytes, this.position);
this.position += 5;
} else if (key === 'height') {
this.bytes.set(heightBytes, this.position);
this.position += 6;
} else if (key === 'videocodecid') {
this.bytes.set(videocodecidBytes, this.position);
this.position += 12;
} else {
for (i = 0; i < key.length; i++) {
this.bytes[this.position] = key.charCodeAt(i);
this.position++;
}
} // skip null byte
this.position++; // write property value
this.view.setFloat64(this.position, val);
this.position += 8; // update flv tag length
this.length = Math.max(this.length, this.position);
++adHoc;
}; // (key:String, val:Boolean):void
this.writeMetaDataBoolean = function (key, val) {
var i;
prepareWrite(this, 2);
this.view.setUint16(this.position, key.length);
this.position += 2;
for (i = 0; i < key.length; i++) {
// if key.charCodeAt(i) >= 255, handle error
prepareWrite(this, 1);
this.bytes[this.position] = key.charCodeAt(i);
this.position++;
}
prepareWrite(this, 2);
this.view.setUint8(this.position, 0x01);
this.position++;
this.view.setUint8(this.position, val ? 0x01 : 0x00);
this.position++;
this.length = Math.max(this.length, this.position);
++adHoc;
}; // ():ByteArray
this.finalize = function () {
var dtsDelta, // :int
len; // :int
switch (this.bytes[0]) {
// Video Data
case _FlvTag.VIDEO_TAG:
// We only support AVC, 1 = key frame (for AVC, a seekable
// frame), 2 = inter frame (for AVC, a non-seekable frame)
this.bytes[11] = (this.keyFrame || extraData ? 0x10 : 0x20) | 0x07;
this.bytes[12] = extraData ? 0x00 : 0x01;
dtsDelta = this.pts - this.dts;
this.bytes[13] = (dtsDelta & 0x00FF0000) >>> 16;
this.bytes[14] = (dtsDelta & 0x0000FF00) >>> 8;
this.bytes[15] = (dtsDelta & 0x000000FF) >>> 0;
break;
case _FlvTag.AUDIO_TAG:
this.bytes[11] = 0xAF; // 44 kHz, 16-bit stereo
this.bytes[12] = extraData ? 0x00 : 0x01;
break;
case _FlvTag.METADATA_TAG:
this.position = 11;
this.view.setUint8(this.position, 0x02); // String type
this.position++;
this.view.setUint16(this.position, 0x0A); // 10 Bytes
this.position += 2; // set "onMetaData"
this.bytes.set([0x6f, 0x6e, 0x4d, 0x65, 0x74, 0x61, 0x44, 0x61, 0x74, 0x61], this.position);
this.position += 10;
this.bytes[this.position] = 0x08; // Array type
this.position++;
this.view.setUint32(this.position, adHoc);
this.position = this.length;
this.bytes.set([0, 0, 9], this.position);
this.position += 3; // End Data Tag
this.length = this.position;
break;
}
len = this.length - 11; // write the DataSize field
this.bytes[1] = (len & 0x00FF0000) >>> 16;
this.bytes[2] = (len & 0x0000FF00) >>> 8;
this.bytes[3] = (len & 0x000000FF) >>> 0; // write the Timestamp
this.bytes[4] = (this.dts & 0x00FF0000) >>> 16;
this.bytes[5] = (this.dts & 0x0000FF00) >>> 8;
this.bytes[6] = (this.dts & 0x000000FF) >>> 0;
this.bytes[7] = (this.dts & 0xFF000000) >>> 24; // write the StreamID
this.bytes[8] = 0;
this.bytes[9] = 0;
this.bytes[10] = 0; // Sometimes we're at the end of the view and have one slot to write a
// uint32, so prepareWrite with a count of 4, since the view is uint8
prepareWrite(this, 4);
this.view.setUint32(this.length, this.length);
this.length += 4;
this.position += 4; // trim down the byte buffer to what is actually being used
this.bytes = this.bytes.subarray(0, this.length);
this.frameTime = _FlvTag.frameTime(this.bytes); // if bytes.bytelength isn't equal to this.length, handle error
return this;
};
};
_FlvTag.AUDIO_TAG = 0x08; // == 8, :uint
_FlvTag.VIDEO_TAG = 0x09; // == 9, :uint
_FlvTag.METADATA_TAG = 0x12; // == 18, :uint
// (tag:ByteArray):Boolean {
_FlvTag.isAudioFrame = function (tag) {
return _FlvTag.AUDIO_TAG === tag[0];
}; // (tag:ByteArray):Boolean {
_FlvTag.isVideoFrame = function (tag) {
return _FlvTag.VIDEO_TAG === tag[0];
}; // (tag:ByteArray):Boolean {
_FlvTag.isMetaData = function (tag) {
return _FlvTag.METADATA_TAG === tag[0];
}; // (tag:ByteArray):Boolean {
_FlvTag.isKeyFrame = function (tag) {
if (_FlvTag.isVideoFrame(tag)) {
return tag[11] === 0x17;
}
if (_FlvTag.isAudioFrame(tag)) {
return true;
}
if (_FlvTag.isMetaData(tag)) {
return true;
}
return false;
}; // (tag:ByteArray):uint {
_FlvTag.frameTime = function (tag) {
var pts = tag[4] << 16; // :uint
pts |= tag[5] << 8;
pts |= tag[6] << 0;
pts |= tag[7] << 24;
return pts;
};
module.exports = _FlvTag;
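As a sanity check of the byte layout in `finalize()` above (an editorial sketch, not part of mux.js): the FLV tag header stores the low 24 bits of the DTS big-endian in bytes 4-6 and the extended high byte in byte 7, which is exactly what `frameTime` reads back.

```javascript
// Editorial sketch: round-trip the FLV tag timestamp layout used by
// finalize() and frameTime() above. Not part of mux.js itself.
function writeTimestamp(bytes, dts) {
  bytes[4] = (dts & 0x00FF0000) >>> 16; // timestamp, middle-high byte
  bytes[5] = (dts & 0x0000FF00) >>> 8;  // timestamp, middle-low byte
  bytes[6] = (dts & 0x000000FF) >>> 0;  // timestamp, low byte
  bytes[7] = (dts & 0xFF000000) >>> 24; // TimestampExtended (high byte)
}

function readTimestamp(bytes) {
  var ts = bytes[4] << 16;
  ts |= bytes[5] << 8;
  ts |= bytes[6] << 0;
  ts |= bytes[7] << 24;
  return ts;
}

var header = new Uint8Array(11);
writeTimestamp(header, 0x01020304);
// readTimestamp(header) === 0x01020304
```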


@@ -0,0 +1,13 @@
"use strict";
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
module.exports = {
tag: require('./flv-tag'),
Transmuxer: require('./transmuxer'),
getFlvHeader: require('./flv-header')
};


@@ -0,0 +1,30 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var TagList = function TagList() {
var self = this;
this.list = [];
this.push = function (tag) {
this.list.push({
bytes: tag.bytes,
dts: tag.dts,
pts: tag.pts,
keyFrame: tag.keyFrame,
metaDataTag: tag.metaDataTag
});
};
Object.defineProperty(this, 'length', {
get: function get() {
return self.list.length;
}
});
};
module.exports = TagList;
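A minimal usage sketch of the list above (editorial; the constructor is inlined here so the snippet runs standalone, while the real module is required from `./tag-list`):

```javascript
// Editorial sketch: TagList stores a shallow copy of each tag's fields
// and exposes a computed length property.
var TagList = function TagList() {
  var self = this;
  this.list = [];
  this.push = function (tag) {
    this.list.push({
      bytes: tag.bytes,
      dts: tag.dts,
      pts: tag.pts,
      keyFrame: tag.keyFrame,
      metaDataTag: tag.metaDataTag
    });
  };
  Object.defineProperty(this, 'length', {
    get: function get() {
      return self.list.length;
    }
  });
};

var tags = new TagList();
tags.push({ bytes: new Uint8Array(0), dts: 0, pts: 0, keyFrame: true });
// tags.length === 1
```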


@@ -0,0 +1,425 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var Stream = require('../utils/stream.js');
var FlvTag = require('./flv-tag.js');
var m2ts = require('../m2ts/m2ts.js');
var AdtsStream = require('../codecs/adts.js');
var H264Stream = require('../codecs/h264').H264Stream;
var CoalesceStream = require('./coalesce-stream.js');
var TagList = require('./tag-list.js');
var _Transmuxer, _VideoSegmentStream, _AudioSegmentStream, collectTimelineInfo, metaDataTag, extraDataTag;
/**
* Store information about the start and end of the track and the
* duration for each frame/sample we process in order to calculate
* the baseMediaDecodeTime
*/
collectTimelineInfo = function collectTimelineInfo(track, data) {
if (typeof data.pts === 'number') {
if (track.timelineStartInfo.pts === undefined) {
track.timelineStartInfo.pts = data.pts;
} else {
track.timelineStartInfo.pts = Math.min(track.timelineStartInfo.pts, data.pts);
}
}
if (typeof data.dts === 'number') {
if (track.timelineStartInfo.dts === undefined) {
track.timelineStartInfo.dts = data.dts;
} else {
track.timelineStartInfo.dts = Math.min(track.timelineStartInfo.dts, data.dts);
}
}
};
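`collectTimelineInfo` above tracks the minimum pts/dts observed so far. A quick standalone check of that behavior (editorial sketch, with the function inlined so it runs on its own):

```javascript
// Editorial sketch: minimum-tracking behavior of collectTimelineInfo above.
function collectTimelineInfo(track, data) {
  if (typeof data.pts === 'number') {
    track.timelineStartInfo.pts = track.timelineStartInfo.pts === undefined ?
      data.pts :
      Math.min(track.timelineStartInfo.pts, data.pts);
  }
  if (typeof data.dts === 'number') {
    track.timelineStartInfo.dts = track.timelineStartInfo.dts === undefined ?
      data.dts :
      Math.min(track.timelineStartInfo.dts, data.dts);
  }
}

var track = { timelineStartInfo: {} };
collectTimelineInfo(track, { pts: 9000, dts: 6000 });
collectTimelineInfo(track, { pts: 3000, dts: 3000 });
// track.timelineStartInfo.pts === 3000 (the minimum seen, not the latest)
```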
metaDataTag = function metaDataTag(track, pts) {
var tag = new FlvTag(FlvTag.METADATA_TAG); // :FlvTag
tag.dts = pts;
tag.pts = pts;
tag.writeMetaDataDouble('videocodecid', 7);
tag.writeMetaDataDouble('width', track.width);
tag.writeMetaDataDouble('height', track.height);
return tag;
};
extraDataTag = function extraDataTag(track, pts) {
var i,
tag = new FlvTag(FlvTag.VIDEO_TAG, true);
tag.dts = pts;
tag.pts = pts;
tag.writeByte(0x01); // version
tag.writeByte(track.profileIdc); // profile
tag.writeByte(track.profileCompatibility); // compatibility
tag.writeByte(track.levelIdc); // level
tag.writeByte(0xFC | 0x03); // reserved (6 bits), NALU length size - 1 (2 bits)
tag.writeByte(0xE0 | 0x01); // reserved (3 bits), num of SPS (5 bits)
tag.writeShort(track.sps[0].length); // data of SPS
tag.writeBytes(track.sps[0]); // SPS
tag.writeByte(track.pps.length); // num of PPS (will there ever be more than 1 PPS?)
for (i = 0; i < track.pps.length; ++i) {
tag.writeShort(track.pps[i].length); // 2 bytes for length of PPS
tag.writeBytes(track.pps[i]); // data of PPS
}
return tag;
};
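The two packed reserved-bit constants written above can be checked directly. This is an editorial sketch; the byte meanings follow the AVCDecoderConfigurationRecord layout that `extraDataTag` emits:

```javascript
// Editorial sketch: the packed header bytes extraDataTag writes.
var lengthSizeMinusOne = 0x03; // 4-byte NALU length prefixes
var numOfSps = 0x01;           // a single sequence parameter set

var byte4 = 0xFC | lengthSizeMinusOne; // 6 reserved bits + lengthSizeMinusOne
var byte5 = 0xE0 | numOfSps;           // 3 reserved bits + numOfSequenceParameterSets
// byte4 === 0xFF, byte5 === 0xE1
```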
/**
* Constructs a single-track media segment from AAC data
* events. The output of this stream can be fed to Flash.
*/
_AudioSegmentStream = function AudioSegmentStream(track) {
var adtsFrames = [],
videoKeyFrames = [],
oldExtraData;
_AudioSegmentStream.prototype.init.call(this);
this.push = function (data) {
collectTimelineInfo(track, data);
if (track) {
track.audioobjecttype = data.audioobjecttype;
track.channelcount = data.channelcount;
track.samplerate = data.samplerate;
track.samplingfrequencyindex = data.samplingfrequencyindex;
track.samplesize = data.samplesize;
track.extraData = track.audioobjecttype << 11 | track.samplingfrequencyindex << 7 | track.channelcount << 3;
}
data.pts = Math.round(data.pts / 90);
data.dts = Math.round(data.dts / 90); // buffer audio data until end() is called
adtsFrames.push(data);
};
this.flush = function () {
var currentFrame,
adtsFrame,
lastMetaPts,
tags = new TagList(); // return early if no audio data has been observed
if (adtsFrames.length === 0) {
this.trigger('done', 'AudioSegmentStream');
return;
}
lastMetaPts = -Infinity;
while (adtsFrames.length) {
currentFrame = adtsFrames.shift(); // write out a metadata frame at every video key frame
if (videoKeyFrames.length && currentFrame.pts >= videoKeyFrames[0]) {
lastMetaPts = videoKeyFrames.shift();
this.writeMetaDataTags(tags, lastMetaPts);
} // also write out metadata tags every 1 second so that the decoder
// is re-initialized quickly after seeking into a different
// audio configuration.
if (track.extraData !== oldExtraData || currentFrame.pts - lastMetaPts >= 1000) {
this.writeMetaDataTags(tags, currentFrame.pts);
oldExtraData = track.extraData;
lastMetaPts = currentFrame.pts;
}
adtsFrame = new FlvTag(FlvTag.AUDIO_TAG);
adtsFrame.pts = currentFrame.pts;
adtsFrame.dts = currentFrame.dts;
adtsFrame.writeBytes(currentFrame.data);
tags.push(adtsFrame.finalize());
}
videoKeyFrames.length = 0;
oldExtraData = null;
this.trigger('data', {
track: track,
tags: tags.list
});
this.trigger('done', 'AudioSegmentStream');
};
this.writeMetaDataTags = function (tags, pts) {
var adtsFrame;
adtsFrame = new FlvTag(FlvTag.METADATA_TAG); // For audio, DTS is always the same as PTS. We want to set the DTS
// however so we can compare with video DTS to determine approximate
// packet order
adtsFrame.pts = pts;
adtsFrame.dts = pts; // AAC is always 10
adtsFrame.writeMetaDataDouble('audiocodecid', 10);
adtsFrame.writeMetaDataBoolean('stereo', track.channelcount === 2);
adtsFrame.writeMetaDataDouble('audiosamplerate', track.samplerate); // Is AAC always 16 bit?
adtsFrame.writeMetaDataDouble('audiosamplesize', 16);
tags.push(adtsFrame.finalize());
adtsFrame = new FlvTag(FlvTag.AUDIO_TAG, true); // For audio, DTS is always the same as PTS. We want to set the DTS
// however so we can compare with video DTS to determine approximate
// packet order
adtsFrame.pts = pts;
adtsFrame.dts = pts;
adtsFrame.view.setUint16(adtsFrame.position, track.extraData);
adtsFrame.position += 2;
adtsFrame.length = Math.max(adtsFrame.length, adtsFrame.position);
tags.push(adtsFrame.finalize());
};
this.onVideoKeyFrame = function (pts) {
videoKeyFrames.push(pts);
};
};
_AudioSegmentStream.prototype = new Stream();
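The `track.extraData` value built in `push()` above packs an AAC AudioSpecificConfig-style header into 16 bits. A worked example (editorial sketch, assuming AAC-LC at 44.1 kHz stereo):

```javascript
// Editorial sketch of the extraData bit packing from push() above:
// audioobjecttype (5 bits) | samplingfrequencyindex (4 bits) |
// channelcount (4 bits), with 3 trailing zero bits.
function packAudioExtraData(audioobjecttype, samplingfrequencyindex, channelcount) {
  return audioobjecttype << 11 | samplingfrequencyindex << 7 | channelcount << 3;
}

// AAC-LC (object type 2), 44.1 kHz (frequency index 4), stereo (2 channels)
var extraData = packAudioExtraData(2, 4, 2);
// extraData === 0x1210
```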
/**
* Store FlvTags for the h264 stream
* @param track {object} track metadata configuration
*/
_VideoSegmentStream = function VideoSegmentStream(track) {
var nalUnits = [],
config,
h264Frame;
_VideoSegmentStream.prototype.init.call(this);
this.finishFrame = function (tags, frame) {
if (!frame) {
return;
} // Check if keyframe and the length of tags.
// This makes sure we write metadata on the first frame of a segment.
if (config && track && track.newMetadata && (frame.keyFrame || tags.length === 0)) {
// Push extra data on every IDR frame in case we did a stream change + seek
var metaTag = metaDataTag(config, frame.dts).finalize();
var extraTag = extraDataTag(track, frame.dts).finalize();
metaTag.metaDataTag = extraTag.metaDataTag = true;
tags.push(metaTag);
tags.push(extraTag);
track.newMetadata = false;
this.trigger('keyframe', frame.dts);
}
frame.endNalUnit();
tags.push(frame.finalize());
h264Frame = null;
};
this.push = function (data) {
collectTimelineInfo(track, data);
data.pts = Math.round(data.pts / 90);
data.dts = Math.round(data.dts / 90); // buffer video until flush() is called
nalUnits.push(data);
};
this.flush = function () {
var currentNal,
tags = new TagList(); // Throw away nalUnits at the start of the byte stream until we find
// the first AUD
while (nalUnits.length) {
if (nalUnits[0].nalUnitType === 'access_unit_delimiter_rbsp') {
break;
}
nalUnits.shift();
} // return early if no video data has been observed
if (nalUnits.length === 0) {
this.trigger('done', 'VideoSegmentStream');
return;
}
while (nalUnits.length) {
currentNal = nalUnits.shift(); // record the track config
if (currentNal.nalUnitType === 'seq_parameter_set_rbsp') {
track.newMetadata = true;
config = currentNal.config;
track.width = config.width;
track.height = config.height;
track.sps = [currentNal.data];
track.profileIdc = config.profileIdc;
track.levelIdc = config.levelIdc;
track.profileCompatibility = config.profileCompatibility;
h264Frame.endNalUnit();
} else if (currentNal.nalUnitType === 'pic_parameter_set_rbsp') {
track.newMetadata = true;
track.pps = [currentNal.data];
h264Frame.endNalUnit();
} else if (currentNal.nalUnitType === 'access_unit_delimiter_rbsp') {
if (h264Frame) {
this.finishFrame(tags, h264Frame);
}
h264Frame = new FlvTag(FlvTag.VIDEO_TAG);
h264Frame.pts = currentNal.pts;
h264Frame.dts = currentNal.dts;
} else {
if (currentNal.nalUnitType === 'slice_layer_without_partitioning_rbsp_idr') {
// the current sample is a key frame
h264Frame.keyFrame = true;
}
h264Frame.endNalUnit();
}
h264Frame.startNalUnit();
h264Frame.writeBytes(currentNal.data);
}
if (h264Frame) {
this.finishFrame(tags, h264Frame);
}
this.trigger('data', {
track: track,
tags: tags.list
}); // Continue with the flush process now
this.trigger('done', 'VideoSegmentStream');
};
};
_VideoSegmentStream.prototype = new Stream();
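The `flush()` loop above starts a new video frame at every access unit delimiter and marks the frame as a key frame when it contains an IDR slice. That grouping logic can be sketched in isolation (editorial example using simplified NAL objects, not the real FlvTag frames):

```javascript
// Editorial sketch: group NAL units into frames, starting a new frame at
// each access_unit_delimiter_rbsp, as flush() does above.
function groupNalsIntoFrames(nalUnits) {
  var frames = [];
  var current = null;
  nalUnits.forEach(function (nal) {
    if (nal.nalUnitType === 'access_unit_delimiter_rbsp') {
      if (current) {
        frames.push(current); // finish the previous frame
      }
      current = { nals: [], keyFrame: false };
    } else if (current) {
      if (nal.nalUnitType === 'slice_layer_without_partitioning_rbsp_idr') {
        current.keyFrame = true; // IDR slice => seekable frame
      }
      current.nals.push(nal);
    }
  });
  if (current) {
    frames.push(current);
  }
  return frames;
}

var frames = groupNalsIntoFrames([
  { nalUnitType: 'access_unit_delimiter_rbsp' },
  { nalUnitType: 'slice_layer_without_partitioning_rbsp_idr' },
  { nalUnitType: 'access_unit_delimiter_rbsp' },
  { nalUnitType: 'slice_layer_without_partitioning_rbsp' }
]);
// frames.length === 2; only the first frame is a key frame
```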
/**
* An object that incrementally transmuxes MPEG2 Transport Stream
* chunks into an FLV.
*/
_Transmuxer = function Transmuxer(options) {
var self = this,
packetStream,
parseStream,
elementaryStream,
videoTimestampRolloverStream,
audioTimestampRolloverStream,
timedMetadataTimestampRolloverStream,
adtsStream,
h264Stream,
videoSegmentStream,
audioSegmentStream,
captionStream,
coalesceStream;
_Transmuxer.prototype.init.call(this);
options = options || {}; // expose the metadata stream
this.metadataStream = new m2ts.MetadataStream();
options.metadataStream = this.metadataStream; // set up the parsing pipeline
packetStream = new m2ts.TransportPacketStream();
parseStream = new m2ts.TransportParseStream();
elementaryStream = new m2ts.ElementaryStream();
videoTimestampRolloverStream = new m2ts.TimestampRolloverStream('video');
audioTimestampRolloverStream = new m2ts.TimestampRolloverStream('audio');
timedMetadataTimestampRolloverStream = new m2ts.TimestampRolloverStream('timed-metadata');
adtsStream = new AdtsStream();
h264Stream = new H264Stream();
coalesceStream = new CoalesceStream(options); // disassemble MPEG2-TS packets into elementary streams
packetStream.pipe(parseStream).pipe(elementaryStream); // !!THIS ORDER IS IMPORTANT!!
// demux the streams
elementaryStream.pipe(videoTimestampRolloverStream).pipe(h264Stream);
elementaryStream.pipe(audioTimestampRolloverStream).pipe(adtsStream);
elementaryStream.pipe(timedMetadataTimestampRolloverStream).pipe(this.metadataStream).pipe(coalesceStream); // if CEA-708 parsing is available, hook up a caption stream
captionStream = new m2ts.CaptionStream(options);
h264Stream.pipe(captionStream).pipe(coalesceStream); // hook up the segment streams once track metadata is delivered
elementaryStream.on('data', function (data) {
var i, videoTrack, audioTrack;
if (data.type === 'metadata') {
i = data.tracks.length; // scan the tracks listed in the metadata
while (i--) {
if (data.tracks[i].type === 'video') {
videoTrack = data.tracks[i];
} else if (data.tracks[i].type === 'audio') {
audioTrack = data.tracks[i];
}
} // hook up the video segment stream to the first track with h264 data
if (videoTrack && !videoSegmentStream) {
coalesceStream.numberOfTracks++;
videoSegmentStream = new _VideoSegmentStream(videoTrack); // Set up the final part of the video pipeline
h264Stream.pipe(videoSegmentStream).pipe(coalesceStream);
}
if (audioTrack && !audioSegmentStream) {
// hook up the audio segment stream to the first track with aac data
coalesceStream.numberOfTracks++;
audioSegmentStream = new _AudioSegmentStream(audioTrack); // Set up the final part of the audio pipeline
adtsStream.pipe(audioSegmentStream).pipe(coalesceStream);
if (videoSegmentStream) {
videoSegmentStream.on('keyframe', audioSegmentStream.onVideoKeyFrame);
}
}
}
}); // feed incoming data to the front of the parsing pipeline
this.push = function (data) {
packetStream.push(data);
}; // flush any buffered data
this.flush = function () {
// Start at the top of the pipeline and flush all pending work
packetStream.flush();
}; // Caption data has to be reset when seeking outside buffered range
this.resetCaptions = function () {
captionStream.reset();
}; // Re-emit any data coming from the coalesce stream to the outside world
coalesceStream.on('data', function (event) {
self.trigger('data', event);
}); // Let the consumer know we have finished flushing the entire pipeline
coalesceStream.on('done', function () {
self.trigger('done');
});
};
_Transmuxer.prototype = new Stream(); // forward compatibility
module.exports = _Transmuxer;
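The constructor above wires the whole pipeline with chained `pipe()` calls (`packetStream.pipe(parseStream).pipe(elementaryStream)` and so on). A minimal sketch of that pattern (editorial; the real implementation lives in `../utils/stream.js`):

```javascript
// Editorial sketch of the pipe() chaining used by the Transmuxer above.
function MiniStream() {
  this.listeners = [];
}
MiniStream.prototype.on = function (type, fn) {
  this.listeners.push({ type: type, fn: fn });
};
MiniStream.prototype.trigger = function (type, data) {
  this.listeners.forEach(function (listener) {
    if (listener.type === type) {
      listener.fn(data);
    }
  });
};
MiniStream.prototype.pipe = function (dest) {
  this.on('data', function (data) {
    dest.push(data);
  });
  return dest; // returning dest is what enables a.pipe(b).pipe(c)
};
MiniStream.prototype.push = function (data) {
  this.trigger('data', data);
};

var a = new MiniStream();
var b = new MiniStream();
var c = new MiniStream();
var received = [];
a.pipe(b).pipe(c);
c.push = function (data) {
  received.push(data);
};
a.push('ts-packet'); // flows a -> b -> c
```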


@@ -0,0 +1,20 @@
/**
* mux.js
*
* Copyright (c) Brightcove
* Licensed Apache-2.0 https://github.com/videojs/mux.js/blob/master/LICENSE
*/
'use strict';
var muxjs = {
codecs: require('./codecs'),
mp4: require('./mp4'),
flv: require('./flv'),
mp2t: require('./m2ts'),
partial: require('./partial')
}; // include all the tools when the full library is required
muxjs.mp4.tools = require('./tools/mp4-inspector');
muxjs.flv.tools = require('./tools/flv-inspector');
muxjs.mp2t.tools = require('./tools/ts-inspector');
module.exports = muxjs;
