
HaishinKit.swift - Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.

  •    Swift

Please set up your project with Swift 4.0. Make sure you set up and activate your AVAudioSession.

digital_video_introduction - A hands-on introduction to video technology: image, video, codec (av1, vp9, h265) and more (ffmpeg encoding)

  •    Jupyter

A gentle introduction to video technology. Although it's aimed at software developers and engineers, we want to make it easy for anyone to learn. The idea was born during a mini workshop for newcomers to video technology. The goal is to introduce some digital video concepts with a simple vocabulary, lots of visual elements and practical examples where possible, and to make this knowledge available everywhere. Please feel free to send corrections and suggestions to improve it.

KSYLive_iOS - Kingsoft Cloud live-streaming SDK [iOS publishing + playback], integrated edition. Supports beauty filters, voice beautification, software/hardware encoding, network auto-adaptation, audio mixing, reverb, and picture-in-picture (PIP).

  •    Objective-C

Kingsoft Cloud live-streaming SDK [iOS publishing + playback], integrated edition. Supports beauty filters, voice beautification, software/hardware encoding, network auto-adaptation, audio mixing, reverb, and picture-in-picture (PIP).

mux.js - Lightweight utilities for inspecting and manipulating video container formats.

  •    Javascript

Lightweight utilities for inspecting and manipulating video container formats. The MP4 inspector is used extensively as a debugging tool for the transmuxer. You can see it in action by cloning the project and opening the debug page in your browser.
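
For a sense of the API, here is a minimal sketch of inspecting an MP4's box structure in Node. The mp4.tools.inspect entry point is my assumption about the export name (it is what the debug tooling appears to use), so verify it against the version you install:

```javascript
const fs = require('fs');
const muxjs = require('mux.js');

// Parse the box structure of an MP4 fragment. inspect() is assumed to
// return an array of box objects carrying type/size and parsed fields.
const bytes = new Uint8Array(fs.readFileSync('fragment.mp4'));
const boxes = muxjs.mp4.tools.inspect(bytes);

// List the top-level boxes (e.g. ftyp, moov, moof, mdat).
boxes.forEach((box) => console.log(box.type, box.size));
```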

ffmediaelement - WPF MediaElement replacement based on FFmpeg

  •    CSharp

FFME is a close (and, I'd like to think, better) drop-in replacement for Microsoft's WPF MediaElement control. While the standard MediaElement uses DirectX (DirectShow) for media playback, FFME uses FFmpeg to read and decode audio and video. This means that if you want to support formats like HLS, or just don't want to go through the hassle of installing codecs on client machines, FFME might be the answer.

First, let's review a few concepts. A packet is a group of bytes read from the input. Every packet has a specific MediaType (Audio, Video, Subtitle, Data) and contains some timing information and, most importantly, compressed data. Packets are sent to a codec, which in turn produces frames. Note that producing one frame does not always take exactly one packet: a packet may contain many frames, but a frame may also require several packets for the decoder to build it. Frames contain timing information and the raw, uncompressed data. Now, you might think you can use frames directly to show pixels on the screen or send data to the sound card. We are close, but some additional processing is still needed: different codecs produce different uncompressed data formats. For example, some video codecs output pixel data in ARGB, others in RGB, and others in YUV420. Therefore, these frames need to be converted into something all hardware can use. I call these converted frames MediaBlocks; they contain uncompressed data in standard audio and video formats.
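
FFME itself is C#, but the packet-to-frame-to-MediaBlock flow described above is easy to sketch in plain JavaScript. Everything below is hypothetical pseudo-structure, not FFME's API; the functions exist only to name the stages:

```javascript
// Hypothetical stand-ins for the stages described above; none of these
// functions exist in FFME, whose real pipeline is written in C#.
function* readPackets(input) { yield* input; }        // compressed data + timing
function decode(packet) { return packet.frames; }     // one packet may yield 0..n frames
function convert(frame, format) {                     // normalize codec-specific output
  return { format, data: frame.data };                // (ARGB, RGB, YUV420, ...) into one format
}
function render(block) {                              // a "MediaBlock": standard, hardware-ready
  console.log('render', block.format, block.data.length);
}

function playbackPipeline(input) {
  for (const packet of readPackets(input)) {
    for (const frame of decode(packet)) {
      render(convert(frame, 'RGBA'));
    }
  }
}

playbackPipeline([{ frames: [{ data: new Uint8Array([0, 0, 0, 0]) }] }]);
```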

node-drone-video - Dump AR.Drone video streams to your filesystem with associated navdata streams

  •    Javascript

Save AR.Drone video streams to your filesystem along with the associated navdata streams. Splicing and editing the raw stream and navdata stream is left up to you, in Final Cut or whatever your preferred video editor is. Once installation is complete, you can begin recording video output from the AR.Drone. First, connect to your drone's WiFi hotspot (for example: ardrone2_058438).
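
The project itself is a CLI, but the underlying idea can be sketched with the related ar-drone module (an assumption on my part about the stack); the file names are placeholders:

```javascript
const fs = require('fs');
const arDrone = require('ar-drone');

// Once connected to the drone's hotspot, open a client, dump the raw
// video stream to disk, and log navdata events alongside it.
const client = arDrone.createClient();

client.getVideoStream().pipe(fs.createWriteStream('flight.h264'));
client.on('navdata', (navdata) => {
  fs.appendFile('flight-navdata.jsonl', JSON.stringify(navdata) + '\n', () => {});
});
```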

video-stream.js - :soon: :vhs: Video stream middleware for express.js

  •    Javascript

Express middleware for streaming video. Call the videoStream middleware, passing the directory in which it should look for videos.
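
Usage might look roughly like the sketch below; the "directory" option name is a guess on my part, so check the project README for the exact signature:

```javascript
const express = require('express');
const videoStream = require('video-stream');

const app = express();

// Mount the middleware and point it at the folder to scan for videos.
// The option name "directory" is assumed, not taken from the docs.
app.use('/videos', videoStream({ directory: './videos' }));

app.listen(3000);
```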

ng-media - AngularJS support for HTML5 media elements

  •    Javascript

ng-media provides a simple, declarative means for using HTML5 audio and video elements. Pull requests, bug reports and suggestions are quite welcome.

AndroidScreenCaster - A live Android screen caster that encodes media with h264/webm over TCP and UDP with low latency

  •    Java

I'm currently in charge of a test automation team. We are trying to make functional testing possible for mobile games. While working on this, we needed to mirror a live Android screen to a web browser. The first approach was MJPEG: we captured the entire screen and sent it over the network at very short intervals. Unsurprisingly, it was inefficient, slow and heavy, though it was still helpful for proving our concept of the system. The second approach was encoding the media data with well-known codecs such as h264 and vp8, and it ended up a success. However, it was hard to find code examples; I mostly referred to the Android googlesource (especially the media test cases). I hope this project helps you save time and understand the concept of live screen casting on Android.

jmuxer - jMuxer, a simple JavaScript MP4 muxer for non-standard streaming communication protocols

  •    Javascript

jMuxer - a simple JavaScript MP4 muxer for non-standard streaming communication protocols. It does not care about the communication protocol; it is simply a JavaScript MP4 remuxer intended to play media in the browser using Media Source Extensions. It expects raw H264 video data and AAC audio data in an ADTS container as input. It was born from a project that needed to play raw H264 and AAC data coming from a live stream encapsulated in a custom transport container, where each chunk contained its duration, audio data and video data behind a simple 4-byte header. Please check the example section for the packet format. After struggling for several days with open source projects like hls.js, I ended up making a new one that would be simpler and more minimalist for this goal.
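
A typical setup, as I recall the README, attaches jMuxer to a video element and feeds it parsed chunks; verify the option names against the current documentation, and note that parseChunk is a hypothetical helper for the custom 4-byte-header format:

```javascript
import JMuxer from 'jmuxer';

// Attach jMuxer to a <video id="player"> element; Media Source
// Extensions handle the actual playback.
const jmuxer = new JMuxer({
  node: 'player',  // id of the target <video> element
  mode: 'both',    // 'video', 'audio', or 'both'
  fps: 30,
});

// Hypothetical helper: strip the custom 4-byte header and split a chunk
// into raw H264 video and its duration. The exact layout depends on
// your transport; see the project's example section.
function parseChunk(buffer) {
  const duration = new DataView(buffer).getUint32(0);
  return { video: new Uint8Array(buffer).subarray(4), duration };
}

const ws = new WebSocket('wss://example.com/stream'); // placeholder URL
ws.binaryType = 'arraybuffer';
ws.onmessage = (event) => jmuxer.feed(parseChunk(event.data));
```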

coconutjs - Node.js client library for encoding videos with Coconut

  •    Javascript

Use the API Request Builder to generate a config file that matches your specific workflow. Note that you can use the environment variable COCONUT_API_KEY to set your API key.
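
With COCONUT_API_KEY exported, submitting a job might look like the sketch below; createJob and the field names follow the Coconut client docs as I remember them, so treat them as assumptions, and the URLs are placeholders:

```javascript
const coconut = require('coconutjs');

// With COCONUT_API_KEY set in the environment, the api_key field can be
// omitted. 'coconut.conf' is the file generated by the API Request Builder.
coconut.createJob({
  conf: 'coconut.conf',
  source: 'https://example.com/video.mp4',
  webhook: 'https://example.com/webhook',
}, (job) => {
  if (job.status === 'ok') {
    console.log('job id:', job.id);
  } else {
    console.error(job.error_code, job.error_message);
  }
});
```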

opendlv - OpenDLV - A modern microservice-based software ecosystem for self-driving vehicles.

  •    CMake

Applications based on OpenDLV are grouped into UDP multicast sessions belonging to IPv4 address 225.0.0.X, where X is within the range [1,254]. All microservices belonging to the same UDP multicast group are able to communicate with each other; two applications running in different UDP multicast sessions do not see each other and are completely separated. The actual UDP multicast session is selected with the command-line parameter --cid=111, where 111 would select the UDP multicast address 225.0.0.111. Microservices exchange data using the message Envelope, which contains, besides the actual message to send, further meta-information such as the sent and received timestamps and the point in time when the contained message was actually sampled. All messages are encoded in Google's Protobuf data format (example), which has been adjusted to preserve forwards and backwards compatibility using libcluon's native implementation of Protobuf.

minih264 - Minimalistic H264/SVC encoder single header library

  •    C

A small but reasonably fast H264/SVC encoder in a single-header library, with SSE/NEON optimizations. A decoder may be added in the future. Disclaimer: the code is highly experimental.