VideoContext
Static Member Summary
Static Public Members
- public static get DEFINITIONS: *
Constructor Summary
Public Constructors
- public constructor(canvas: Canvas, initErrorCallback: function, options: Object) Initialise the VideoContext and render to the specified canvas.
Member Summary
Public Members
- public get currentTime: number: * Get how far through the internal timeline playback has progressed.
- public set currentTime(currentTime: number) Set the progress through the internal timeline.
- public get destination: DestinationNode: * Get the final node in the render graph which represents the canvas to display content on to.
- public get duration: number: * Get the time at which the last node in the current internal timeline finishes playing.
- public get element: HTMLElement: * Get the canvas that the VideoContext is using.
- public get id: * Returns an ID assigned to the VideoContext instance.
- public set id Set the ID of the VideoContext instance.
- public set playbackRate(rate: number) Set the playback rate of the VideoContext instance.
- public get playbackRate: number: * Return the current playbackRate of the video context.
- public get state: number: * Get the current state.
- public set volume(volume: number) Set the volume of all VideoNodes created in the VideoContext.
- public get volume: number: * Return the current volume of the video context.
Private Members
- private _callbacks: *
- private _canvas: *
- private _currentTime: *
- private _destinationNode: *
- private _endOnLastSourceEnd: *
- private _gl: *
- private _id: *
- private _playbackRate: *
- private _processingNodes: *[]
- private _renderGraph: *
- private _sourceNodes: *[]
- private _sourcesPlaying: *
- private _state: *
- private _timeline: *[]
- private _timelineCallbacks: *[]
- private _useVideoElementCache: *
- private _videoElementCache: *
- private _volume: *
Method Summary
Public Methods
- public audio(src: *, sourceOffset: number, preloadTime: number, audioElementAttributes: {}): * Create a new node representing an audio source
- public canvas(src: Canvas): CanvasNode Create a new node representing a canvas source
- public compositor(definition: Object): CompositingNode Create a new compositing node.
- public createCanvasSourceNode(canvas: *, sourceOffset: number, preloadTime: number): *
- public createCompositingNode(definition: *): *
- public createEffectNode(definition: *): *
- public createImageSourceNode(src: *, sourceOffset: number, preloadTime: number, imageElementAttributes: {}): *
- public createTransitionNode(definition: *): *
- public createVideoSourceNode(src: *, sourceOffset: number, preloadTime: number, videoElementAttributes: {}): *
- public effect(definition: Object): EffectNode Create a new effect node.
- public image(src: string | Image, preloadTime: number, imageElementAttributes: Object): ImageNode Create a new node representing an image source
- public pause(): boolean Pause playback of the VideoContext
- public play(): boolean Start the VideoContext playing
- public registerCallback(type: String, func: Function): boolean Register a callback to listen to one of the following events: "stalled", "update", "ended", "content", "nocontent"
- public registerTimelineCallback(time: number, func: Function, ordering: number) Register a callback to happen at a specific point in time.
- public reset() Destroy all nodes in the graph and reset the timeline.
- public snapshot(): * Get a JS Object containing the state of the VideoContext instance and all the created nodes.
- public transition(definition: Object): TransitionNode Create a new transition node.
- public unregisterCallback(func: Function): boolean Remove a previously registered callback
- public unregisterTimelineCallback(func: Function) Unregister a callback which happens at a specific point in time.
- public update(dt: Number) This allows manual calling of the update loop of the VideoContext.
- public video(src: string | Video): VideoNode Create a new node representing a video source
Private Methods
- private _callCallbacks(type: *)
- private _depricate(msg: *)
- private _isStalled(): *
- private _update(dt: *)
Static Public Members
public static get DEFINITIONS: * source
Public Constructors
public constructor(canvas: Canvas, initErrorCallback: function, options: Object) source
Initialise the VideoContext and render to the specified canvas. A second parameter can be passed to the constructor: a function that gets called if the VideoContext fails to initialise.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, function(){console.error("Sorry, your browser doesn't support WebGL");});
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
Public Members
public get currentTime: number: * source
Get how far through the internal timeline playback has progressed.
Getting this value gives the current playhead position. It can be used for updating timelines.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
setTimeout(function(){console.log(ctx.currentTime);},1000); //should print roughly 1.0
public set currentTime(currentTime: number) source
Set the progress through the internal timeline. Setting this can be used as a way to implement a scrubbable timeline.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();
public get destination: DestinationNode: * source
Get the final node in the render graph which represents the canvas to display content on to.
This property is read-only and there can only ever be one destination node. Other nodes can connect to this node, but you cannot connect it to anything.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);
public get duration: number: * source
Get the time at which the last node in the current internal timeline finishes playing.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
console.log(ctx.duration); //prints 0
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
console.log(ctx.duration); //prints 10
ctx.play();
public get element: HTMLElement: * source
Get the canvas that the VideoContext is using.
Return:
- HTMLElement: The canvas that the VideoContext is using.
public get id: * source
Returns an ID assigned to the VideoContext instance. This will either be the same id as the underlying canvas element, or a uniquely generated one.
public set playbackRate(rate: number) source
Set the playback rate of the VideoContext instance. This will alter the playback speed of all media elements played through the VideoContext.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);
ctx.playbackRate = 2;
ctx.play(); // Double playback rate means this will finish playing in 5 seconds.
public get state: number: * source
Get the current state.
This will be one of:
- VideoContext.STATE.PLAYING: current sources on timeline are active
- VideoContext.STATE.PAUSED: all sources are paused
- VideoContext.STATE.STALLED: one or more sources is unable to play
- VideoContext.STATE.ENDED: all sources have finished playing
- VideoContext.STATE.BROKEN: the render graph is in a broken state
public set volume(volume: number) source
Set the volume of all VideoNodes created in the VideoContext.
Private Members
private _callbacks: * source
private _canvas: * source
private _destinationNode: * source
private _endOnLastSourceEnd: * source
private _gl: * source
private _id: * source
private _processingNodes: *[] source
private _renderGraph: * source
private _sourceNodes: *[] source
private _sourcesPlaying: * source
private _state: * source
private _timeline: *[] source
private _timelineCallbacks: *[] source
private _useVideoElementCache: * source
private _videoElementCache: * source
Public Methods
public audio(src: *, sourceOffset: number, preloadTime: number, audioElementAttributes: {}): * source
Return:
- *
public canvas(src: Canvas): CanvasNode source
Create a new node representing a canvas source
Params:
- src (Canvas): The canvas element to create the canvas node from.
public compositor(definition: Object): CompositingNode source
Create a new compositing node.
Compositing nodes are used for operations such as combining multiple video sources into a single track/connection for further processing in the graph.
A compositing node is slightly different to other processing nodes in that it only has one input in its definition but can have unlimited connections made to it. The shader in the definition is run for each input in turn, drawing them to the output buffer. This means there can be no interaction between the separate inputs to a compositing node, as they are individually processed in separate shader passes.
Params:
- definition (Object): An object defining the shaders, inputs, and properties of the compositing node to create. Built-in definitions can be found by accessing VideoContext.DEFINITIONS.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//A simple compositing node definition which just renders all the inputs to the output buffer.
var combineDefinition = {
vertexShader : "\
attribute vec2 a_position;\
attribute vec2 a_texCoord;\
varying vec2 v_texCoord;\
void main() {\
gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
v_texCoord = a_texCoord;\
}",
fragmentShader : "\
precision mediump float;\
uniform sampler2D u_image;\
uniform float a;\
varying vec2 v_texCoord;\
varying float v_progress;\
void main(){\
vec4 color = texture2D(u_image, v_texCoord);\
gl_FragColor = color;\
}",
properties:{
"a":{type:"uniform", value:0.0},
},
inputs:["u_image"]
};
//Create the node, passing in the definition.
var trackNode = ctx.compositor(combineDefinition);
//create two videos which will play back to back
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(10);
videoNode2.stop(20);
//Connect the nodes to the combine node. This will give a single connection representing the two videos which can
//be connected to other effects such as LUTs, chromakeyers, etc.
videoNode1.connect(trackNode);
videoNode2.connect(trackNode);
//Don't do anything exciting, just connect it to the output.
trackNode.connect(ctx.destination);
public createCanvasSourceNode(canvas: *, sourceOffset: number, preloadTime: number): * source
Return:
- *
public createCompositingNode(definition: *): * source
Params:
- definition (*)
Return:
- *
public createEffectNode(definition: *): * source
Params:
- definition (*)
Return:
- *
public createImageSourceNode(src: *, sourceOffset: number, preloadTime: number, imageElementAttributes: {}): * source
Return:
- *
public createTransitionNode(definition: *): * source
Params:
- definition (*)
Return:
- *
public createVideoSourceNode(src: *, sourceOffset: number, preloadTime: number, videoElementAttributes: {}): * source
Return:
- *
public effect(definition: Object): EffectNode source
Create a new effect node.
Params:
- definition (Object): An object defining the shaders, inputs, and properties of the effect node to create. Built-in definitions can be found by accessing VideoContext.DEFINITIONS.
public image(src: string | Image, preloadTime: number, imageElementAttributes: Object): ImageNode source
Create a new node representing an image source
Params:
- src (string | Image): The URL or image element to create the image node from.
- preloadTime (number): How long before a node is to be displayed to attempt to load it.
- imageElementAttributes (Object): Any attributes to be given to the underlying image element.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image("image.png");

Example:
var canvasElement = document.getElementById("canvas");
var imageElement = document.getElementById("image");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image(imageElement);
public pause(): boolean source
Pause playback of the VideoContext
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();
setTimeout(function(){ctx.pause();}, 1000); //pause playback after roughly one second.
public play(): boolean source
Start the VideoContext playing
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
public registerCallback(type: String, func: Function): boolean source
Register a callback to listen to one of the following events: "stalled", "update", "ended", "content", "nocontent"
"stalled" happens any time playback is stopped due to unavailable data for playing assets (i.e. video still loading). "update" is called any time a frame is rendered to the screen. "ended" is called once playback has finished (i.e. ctx.currentTime == ctx.duration). "content" is called at the start of a time region where there is content playing out of one or more sourceNodes. "nocontent" is called at the start of any time region where the VideoContext is still playing, but there are currently no actively playing sources.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
ctx.registerCallback("stalled", function(){console.log("Playback stalled");});
ctx.registerCallback("update", function(){console.log("new frame");});
ctx.registerCallback("ended", function(){console.log("Playback ended");});
public registerTimelineCallback(time: number, func: Function, ordering: number) source
Register a callback to happen at a specific point in time.
public reset() source
Destroy all nodes in the graph and reset the timeline. After calling this any created nodes will be unusable.
public snapshot(): * source
Get a JS Object containing the state of the VideoContext instance and all the created nodes.
Return:
- *
public transition(definition: Object): TransitionNode source
Create a new transition node.
Transition nodes are a type of effect node which have parameters that can be changed as events on the timeline.
For example, a transition node which cross-fades between two videos could have a "mix" property which sets the progress through the transition. Rather than having to write your own code to adjust this property at specific points in time, a transition node has a "transition" function which takes a startTime, stopTime, targetValue, and a propertyName (which will be "mix"). This will linearly interpolate the property from the current value to targetValue between the startTime and stopTime.
Params:
- definition (Object): An object defining the shaders, inputs, and properties of the transition node to create.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//A simple cross-fade node definition which cross-fades between two videos based on the mix property.
var crossfadeDefinition = {
vertexShader : "\
attribute vec2 a_position;\
attribute vec2 a_texCoord;\
varying vec2 v_texCoord;\
void main() {\
gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
v_texCoord = a_texCoord;\
}",
fragmentShader : "\
precision mediump float;\
uniform sampler2D u_image_a;\
uniform sampler2D u_image_b;\
uniform float mix;\
varying vec2 v_texCoord;\
varying float v_mix;\
void main(){\
vec4 color_a = texture2D(u_image_a, v_texCoord);\
vec4 color_b = texture2D(u_image_b, v_texCoord);\
color_a[0] *= mix;\
color_a[1] *= mix;\
color_a[2] *= mix;\
color_a[3] *= mix;\
color_b[0] *= (1.0 - mix);\
color_b[1] *= (1.0 - mix);\
color_b[2] *= (1.0 - mix);\
color_b[3] *= (1.0 - mix);\
gl_FragColor = color_a + color_b;\
}",
properties:{
"mix":{type:"uniform", value:0.0},
},
inputs:["u_image_a","u_image_b"]
};
//Create the node, passing in the definition.
var transitionNode = ctx.transition(crossfadeDefinition);
//create two videos which will overlap by two seconds
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(8);
videoNode2.stop(18);
//Connect the nodes to the transition node.
videoNode1.connect(transitionNode);
videoNode2.connect(transitionNode);
//Set-up a transition which happens at the crossover point of the playback of the two videos
transitionNode.transition(8,10,1.0,"mix");
//Connect the transition node to the output
transitionNode.connect(ctx.destination);
//start playback
ctx.play();
public unregisterCallback(func: Function): boolean source
Remove a previously registered callback
Params:
- func (Function): The callback to remove.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
//the callback
var updateCallback = function(){console.log("new frame")};
//register the callback
ctx.registerCallback("update", updateCallback);
//then unregister it
ctx.unregisterCallback(updateCallback);
public unregisterTimelineCallback(func: Function) source
Unregister a callback which happens at a specific point in time.
Params:
- func (Function): The callback to unregister.
public update(dt: Number) source
This allows manual calling of the update loop of the VideoContext.
Params:
- dt (Number): The difference in seconds between this and the previous call to update.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, undefined, {"manualUpdate" : true});
var previousTime;
function update(time){
if (previousTime === undefined) previousTime = time;
var dt = (time - previousTime)/1000;
ctx.update(dt);
previousTime = time;
requestAnimationFrame(update);
}
requestAnimationFrame(update);
public video(src: string | Video): VideoNode source
Create a new node representing a video source
Params:
- src (string | Video): The URL or video element to create the video node from.
Example:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");

Example:
var canvasElement = document.getElementById("canvas");
var videoElement = document.getElementById("video");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video(videoElement);
Private Methods
private _callCallbacks(type: *) source
Params:
- type (*)
private _depricate(msg: *) source
Params:
- msg (*)
private _update(dt: *) source
Params:
- dt (*)