Uses jsmpeg.js to decode MPEG-1 video and Broadway.js to decode H.264 video. The video stream is transported over WebSocket, and the server is written in Go.
Just run websocketvideostream.exe.
Then you can see the video here:
H.264-AVC-ISO_IEC_14496-10-2012.pdf

For MPEG-4 H.264 transcoders that deliver I-frame, P-frame, and B-frame NALUs inside an MPEG-2 transport stream, the resulting packetized elementary streams (PES) are timestamped with presentation time stamps (PTS) and decoding time stamps (DTS) in units of 1/90000 of a second.
The NALUs arrive in DTS order, in a repeating pattern such as
I P B B B P B B B ...
while the intended playback (rendering) order is
I B B B P B B B P ...
(This transport strategy ensures that both frames a B-frame bridges are already in the decoder before the B-frame is processed.)
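To make the reordering concrete, here is a small Go sketch (matching the server's language). The frame values are illustrative, not taken from a real stream: frames arrive in DTS (decode) order I P B B B, and sorting them by PTS recovers the display order I B B B P.

```go
package main

import (
	"fmt"
	"sort"
)

// frame pairs a NALU type with its 90 kHz PTS.
// (The values used below are illustrative, not from a real stream.)
type frame struct {
	typ string
	pts int64
}

// displayOrder returns the frames sorted by PTS, i.e. in the order
// the renderer should present them.
func displayOrder(decodeOrder []frame) []frame {
	out := append([]frame(nil), decodeOrder...)
	sort.Slice(out, func(i, j int) bool { return out[i].pts < out[j].pts })
	return out
}

func main() {
	// Frames as they arrive, in DTS (decode) order: I P B B B.
	// The P-frame precedes the B-frames that reference it, so both
	// anchors of each B-frame are decoded before the B-frame itself.
	arrival := []frame{
		{"I", 0}, {"P", 12000}, {"B", 3000}, {"B", 6000}, {"B", 9000},
	}
	for _, f := range displayOrder(arrival) {
		fmt.Print(f.typ, " ")
	}
	fmt.Println()
}
```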
For FLV, the Timestamp field (FLV spec p. 69) tells when the frame should be fed to the decoder, in milliseconds:
timestamp = DTS / 90.0
The CompositionTime field (FLV spec p. 72) tells the renderer when to perform ("compose") the video frame on the display device, in milliseconds after it enters the decoder:
compositionTime = (PTS - DTS) / 90.0
(Because PTS >= DTS, this delta is never negative.)
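The two formulas above can be sketched as a small Go helper. The field names (Timestamp, CompositionTime) come from the FLV spec; the function name and the sample PTS/DTS values are ours:

```go
package main

import "fmt"

// flvTimes converts 90 kHz PTS/DTS values into the two FLV fields:
// the tag Timestamp (decode time, ms) and the AVC CompositionTime
// (presentation offset, ms), per timestamp = DTS / 90.0 and
// compositionTime = (PTS - DTS) / 90.0.
func flvTimes(pts, dts int64) (timestampMs, compositionMs float64) {
	return float64(dts) / 90.0, float64(pts-dts) / 90.0
}

func main() {
	// A frame decoded at DTS 108000 and presented at PTS 126000
	// (illustrative values): 108000/90 = 1200 ms decode time,
	// (126000-108000)/90 = 200 ms composition offset.
	ts, ct := flvTimes(126000, 108000)
	fmt.Printf("Timestamp=%.0f ms CompositionTime=%.0f ms\n", ts, ct)
}
```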