GStreamer can now be used on Android and iOS.
On Android, GStreamer is built with the NDK cross-toolchain (ndk-build) and JNI bindings are added on top. Android supports shared libraries, but only in a limited way. The platform natively provides OpenGL ES for video and OpenSL ES for audio. There is also a MediaCodec Java API that maps to hardware-specific codecs, and a Camera Java API that could be used as a camera source but is not yet integrated into GStreamer.
On iOS, no bindings are needed because Objective-C is a superset of C, so the C API can be used directly. Apps are completely sandboxed: you can't access arbitrary files. To deploy and test, you need a paid Apple developer account. You can only develop on OS X, and you can only really test on devices, because the simulator doesn't implement the complete API. Bottom line: it's quite expensive. iOS also has OpenGL ES, and it has CoreAudio as an audio sink, but it doesn't have a public hardware-accelerated video codec API. AVFoundation does give you the functionality of uridecodebin, but obviously only for the formats supported by iOS.
The lack of usable shared libraries on both platforms makes life difficult for plugins, so only static plugins are supported. That means the app has to explicitly link in the required plugins. Because GStreamer is licensed under the LGPL, static linking has consequences: the LGPL requires that users be able to relink the application against a modified GStreamer, which in practice means shipping object files or making the app open source.
For the video sink, eglesvideosink has been ported to both platforms. It works like any other video sink, but you have to give it, through the GstVideoOverlay interface, a native window handle to render into.
For audio on Android, an OpenSL ES-based sink and source (openslaudiosink and openslaudiosrc) are available; they use Android-specific API extensions to GStreamer.
amcvideodec and amcaudiodec use the MediaCodec Java API for hardware-accelerated decoding; the JNI layer adds no noticeable overhead. The only problem is that colorspaces are not clearly defined. The supported video codecs depend on the device; the list can be found in /system/etc/media_codecs.xml.
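For illustration, that codec list is an XML file roughly like the sketch below; the exact entries and codec names vary per device and Android version (the ones shown here are only examples):

```xml
<!-- Illustrative fragment of a device's media_codecs.xml.
     Codec names are examples; real entries differ per device. -->
<MediaCodecs>
  <Decoders>
    <MediaCodec name="OMX.qcom.video.decoder.avc" type="video/avc" />
    <MediaCodec name="OMX.google.vorbis.decoder" type="audio/vorbis" />
  </Decoders>
  <Encoders>
    <MediaCodec name="OMX.qcom.video.encoder.avc" type="video/avc" />
  </Encoders>
</MediaCodecs>
```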
androidcamerasrc doesn’t exist yet, but should be easy to do.
For iOS: avfvideosrc exists and produces raw YUV in a few discrete formats and resolutions.
iosavassetsrc reads from an iOS asset (ipod-library, file, hls), then demuxes and decodes it. If you need some other stream, you could transmux it to a file or to HLS and read it from there.
To build GStreamer, cerbero is used. The same build system works on all platforms (Windows, Linux, OS X; hence the three-headed dog). It reuses each project's own autotools/CMake build system: cerbero downloads the sources, configures and builds them, and also downloads toolchains and other dependencies. It has been extended with cross-build configuration files for Android and iOS.
On Android, GStreamer and the selected plugins are built and linked into a single shared library, libgstreamer_android.so. This is used by an app library (libapp.so) that provides the Java bindings and calls into the GStreamer library. See the slides for details on how to write the .mk files. A gstreamer_android.c file is generated that registers the static plugins and handles a bit of Android boilerplate. For linking, libtool would normally be used to parse the .la files, but that isn't portable to Windows, so a smaller replacement was written based on make and sed. The GStreamer project provides tarballs with the headers plus the .a, .la, and .mk files needed to make all this work.
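As a sketch of what the app-side .mk file looks like, the fragment below follows the layout used by the GStreamer Android tutorials; the exact variable names and paths (e.g. GSTREAMER_ROOT_ANDROID, gstreamer.mk, the plugin list) depend on the SDK release:

```makefile
# Sketch of jni/Android.mk, assuming the GStreamer Android tutorial layout;
# variable names and paths may differ per SDK release.
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE           := app          # builds libapp.so with the JNI glue
LOCAL_SRC_FILES        := app.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS           := -llog -landroid
include $(BUILD_SHARED_LIBRARY)

# Pull in the GStreamer build logic, which links the selected static plugins
# into a single libgstreamer_android.so and generates gstreamer_android.c.
GSTREAMER_ROOT           := $(GSTREAMER_ROOT_ANDROID)
GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_ROOT)/share/gst-android/ndk-build/
GSTREAMER_PLUGINS        := coreelements ogg vorbis   # pick what your app needs
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```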
On iOS, all the plugins are linked in, but the registration function uses #ifdefs, and the strip step removes the unused archives and symbols. You uncomment the #defines of the plugins you want. GStreamer for iOS is distributed as a DMG image, which contains the package and some templates for Xcode.
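The registration pattern looks roughly like the fragment below, using GStreamer's static-plugin macros; this is a sketch, and the actual file shipped in the Xcode templates may use different names:

```c
/* Sketch of the iOS static-plugin registration pattern; the real template
 * may differ. Uncomment the defines for the plugins you want; the strip
 * step then removes the unused static archives and symbols. */
#include <gst/gst.h>

#define PLUGIN_COREELEMENTS
/* #define PLUGIN_OGG */
/* #define PLUGIN_VORBIS */

#ifdef PLUGIN_COREELEMENTS
GST_PLUGIN_STATIC_DECLARE (coreelements);
#endif

void
gst_ios_init (void)
{
  gst_init (NULL, NULL);
#ifdef PLUGIN_COREELEMENTS
  GST_PLUGIN_STATIC_REGISTER (coreelements);
#endif
}
```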
There is currently only one Android media player app that uses this stuff, and it’s still in alpha stage so not publicly available from the market.