For the last week, I have been experimenting with streaming video from/through the foxbox. I don’t have anything to show for it yet, but it got me thinking about the best way to expose this through the usual REST (or native) API.
Here’s a quick braindump.
Streams are kinds of `Value`. For instance, an IP camera exposing a live stream could offer a channel implementing `camera/live-stream-webrtc` (although for a prototype, this will more likely be `camera/live-stream-html5`, as this should be simpler to implement). Additional channels can be provided, e.g. `camera/replay-stream-webrtc`, which will take arguments to determine the start of the replay.
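To make that a bit more concrete, here is a rough Rust sketch of how the camera-facing channel kinds could be named. Only the string identifiers come from the proposal above; the enum and the type of the replay argument are assumptions for illustration, not existing foxbox code.

```rust
// Purely illustrative; only the string identifiers come from the proposal.
pub enum CameraStreamKind {
    /// "camera/live-stream-webrtc": live feed negotiated over WebRTC.
    LiveWebRtc,
    /// "camera/live-stream-html5": live feed as an HTML5-playable stream (prototype).
    LiveHtml5,
    /// "camera/replay-stream-webrtc": replay starting at the requested instant.
    ReplayWebRtc { start: std::time::SystemTime },
}
```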
Streams support at least the following methods (a rough sketch follows the list):

- `start`;
- `pause`;
- `drop`.
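A minimal sketch of that surface, assuming a plain Rust trait; all names and signatures here are placeholders rather than existing foxbox code.

```rust
// Hypothetical trait for a stream handed back by an adapter.
#[derive(Debug)]
pub enum StreamError {
    Backend(String),
}

pub trait Stream {
    /// Begin serving the feed at the temporary URL allocated by the router.
    fn start(&mut self, temp_url: &str) -> Result<(), StreamError>;
    /// Stop producing frames without tearing the pipeline down.
    fn pause(&mut self) -> Result<(), StreamError>;
    /// Release the pipeline and any resources. Called `shutdown` here to avoid
    /// confusion with Rust's `Drop` trait; at the API level this is `drop`.
    fn shutdown(&mut self);
}
```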
The Taxonomy Router is in charge of allocating a temporary URL for a Stream returned from an adapter and calling `drop` once the client is disconnected. I’m not sure how to detect disconnection yet. I’m not sure how to detect/implement a `pause` request, either.
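For illustration, the router-side bookkeeping could look roughly like the following, reusing the hypothetical `Stream` trait above; the URL scheme is made up, and how we actually notice the disconnection is exactly the open question.

```rust
use std::collections::HashMap;

// Sketch only: names and URL layout are assumptions.
pub struct StreamRouter {
    base: String,                             // e.g. "https://foxbox.local/streams"
    active: HashMap<String, Box<dyn Stream>>, // token -> live stream
}

impl StreamRouter {
    /// Allocate a temporary URL for a stream handed back by an adapter,
    /// tell the stream to start serving it, and remember it.
    pub fn register(&mut self, token: String, mut stream: Box<dyn Stream>)
        -> Result<String, StreamError>
    {
        let url = format!("{}/{}", self.base, token);
        stream.start(&url)?;
        self.active.insert(token, stream);
        Ok(url)
    }

    /// Called once we decide the client is gone (detection TBD); drops the
    /// stream so the adapter can release its resources.
    pub fn client_disconnected(&mut self, token: &str) {
        if let Some(mut stream) = self.active.remove(token) {
            stream.shutdown();
        }
    }
}
```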
Once the stream has received `start` and its temporary URL, it is in charge of providing an HTML5/WebRTC video feed. Behind the scenes, this will be implemented through GStreamer, at least for the prototype.
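As a rough idea of that prototype path, here is one way an adapter-side stream could shell out to `gst-launch-1.0` and publish an HLS playlist that an HTML5 `<video>` element (or hls.js) can consume. The RTSP source, the H.264 assumption and the file locations are all illustrative, and the trait names come from the sketch above.

```rust
use std::process::{Child, Command};

// One possible prototype implementation of the hypothetical `Stream` trait.
pub struct GstHlsStream {
    source: String, // e.g. "rtsp://192.168.1.10/live" (illustrative)
    child: Option<Child>,
}

impl Stream for GstHlsStream {
    fn start(&mut self, _temp_url: &str) -> Result<(), StreamError> {
        // Depayload the camera's H.264, remux into MPEG-TS segments, publish HLS.
        let pipeline = format!(
            "rtspsrc location={} ! rtph264depay ! h264parse ! mpegtsmux ! \
             hlssink location=/tmp/segment%05d.ts playlist-location=/tmp/live.m3u8",
            self.source
        );
        let child = Command::new("gst-launch-1.0")
            .args(pipeline.split_whitespace())
            .spawn()
            .map_err(|e| StreamError::Backend(e.to_string()))?;
        self.child = Some(child);
        Ok(())
    }

    fn pause(&mut self) -> Result<(), StreamError> {
        // Pausing a gst-launch child cleanly is one of the open questions.
        Err(StreamError::Backend("pause not implemented".into()))
    }

    fn shutdown(&mut self) {
        if let Some(mut child) = self.child.take() {
            let _ = child.kill();
        }
    }
}
```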
Any thoughts?
Cc @fabrice, @dhylands, @azasypkin, @aosmond.
*edit* Fixed a typo. This was `live-stream`, not `native-stream`.