timed metadata support? #254

Open
alew3 opened this issue Aug 2, 2021 · 2 comments

alew3 commented Aug 2, 2021

Is there any support for the media server to inject metadata into the stream, similar to what AWS IVS does?
https://docs.aws.amazon.com/ivs/latest/userguide/metadata.html

This is important for syncing the stream with the user interface, since there is a ~20s streaming delay.

yondonfu (Member) commented

Hi @alew3!

LPMS currently does not support injecting metadata into the stream. I believe AWS IVS leverages ID3 metadata tags in HLS in order to inject metadata into the stream. I suspect it would be possible to support such a workflow in LPMS, but it may make more sense to handle injection of timed metadata outside of LPMS depending on what you're trying to accomplish and the stack that you would like to use.
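
For example, one way to handle this outside the media pipeline is to annotate the HLS playlist itself with `EXT-X-DATERANGE` tags (defined in RFC 8216), whose `X-`prefixed attributes can carry arbitrary client-defined data. A rough Go sketch; the helper and the `X-COM-EXAMPLE-PAYLOAD` attribute name are made up for illustration:

```go
package main

import (
	"fmt"
	"time"
)

// buildDateRangeTag formats an EXT-X-DATERANGE tag whose X- attribute
// carries a client-defined payload keyed to a wall-clock start time.
func buildDateRangeTag(id string, start time.Time, payload string) string {
	return fmt.Sprintf(
		"#EXT-X-DATERANGE:ID=%q,START-DATE=%q,X-COM-EXAMPLE-PAYLOAD=%q",
		id, start.UTC().Format(time.RFC3339), payload,
	)
}

func main() {
	// The tag would be appended to the live media playlist. For players to
	// map START-DATE to a position in the stream, the playlist also needs
	// EXT-X-PROGRAM-DATE-TIME tags.
	fmt.Println(buildDateRangeTag("poll-1", time.Now(), "poll-open"))
}
```

The alternative is in-band ID3 (which appears to be what IVS uses); that requires remuxing the segments, so it touches the media pipeline itself.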

Could you share more about the desired architecture for what you're looking to build and how you see LPMS and/or other Livepeer tools (whether it be the node or the hosted livepeer.com API) fitting into that architecture if timed metadata is supported?

alew3 (Author) commented Aug 24, 2021

The way AWS IVS does it is very nice: they provide an RTMPS ingest server that does the encoding, plus an API call for sending data.

The metadata API is called with the stream_id and the desired metadata to pass along. The time at which the API call is made is treated as the sync point for "mixing" the metadata into the video.
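
For reference, the call looks roughly like this with the AWS SDK for Go v2 (a sketch, not tested; note IVS keys the request off the channel ARN rather than a stream id, and the ARN below is a placeholder):

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/ivs"
)

func main() {
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		log.Fatal(err)
	}
	client := ivs.NewFromConfig(cfg)

	// The payload (limited to about 1 KB per the IVS docs) is inserted
	// into the stream at the point corresponding to when this call is made.
	_, err = client.PutMetadata(context.TODO(), &ivs.PutMetadataInput{
		ChannelArn: aws.String("arn:aws:ivs:us-east-1:123456789012:channel/abcdEFGH"), // placeholder
		Metadata:   aws.String(`{"event":"poll-open","id":42}`),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```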

Somehow this is encoded together with the video stream, and the client-side JavaScript player fires an event whenever it encounters metadata in the video. This way the stream and the application can be "synced" for a better user experience.

I haven't looked under the hood at how it actually works, but it seems to build on existing standards, such as HLS timed metadata (ID3 tags): https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/HTTP_Live_Streaming_Metadata_Spec/Introduction/Introduction.html
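
To illustrate what that looks like on the wire: per the Apple spec above, the metadata travels as ID3v2 frames in a timed-metadata elementary stream inside the TS segments. Here's a sketch of building such a frame in Go (the TXXX description string is arbitrary; actually muxing this into the transport stream is the harder part that a media server would handle):

```go
package main

import "fmt"

// synchsafe encodes a size as four 7-bit bytes, as ID3v2 requires.
func synchsafe(n int) []byte {
	return []byte{
		byte(n >> 21 & 0x7f),
		byte(n >> 14 & 0x7f),
		byte(n >> 7 & 0x7f),
		byte(n & 0x7f),
	}
}

// buildTXXX returns an ID3v2.4 tag containing a single TXXX
// (user-defined text) frame with a description and a value.
func buildTXXX(desc, value string) []byte {
	// Frame body: encoding byte (0x03 = UTF-8), description, NUL, value.
	body := append([]byte{0x03}, []byte(desc)...)
	body = append(body, 0x00)
	body = append(body, []byte(value)...)

	// Frame header: 4-byte ID, synchsafe body size, two flag bytes.
	frame := append([]byte("TXXX"), synchsafe(len(body))...)
	frame = append(frame, 0x00, 0x00)
	frame = append(frame, body...)

	// Tag header: "ID3", version 2.4.0, flags, synchsafe tag size
	// (size excludes the 10-byte header itself).
	tag := append([]byte("ID3"), 0x04, 0x00, 0x00)
	tag = append(tag, synchsafe(len(frame))...)
	return append(tag, frame...)
}

func main() {
	tag := buildTXXX("example.metadata", `{"event":"poll-open"}`)
	fmt.Printf("% x\n", tag)
}
```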

For some reason, this very useful feature is hard to find in live streaming solutions.
