<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <html> <head> <title>OpenMAX AL for Android</title> </head> <body> <h1>OpenMAX AL for Android</h1> This article describes the Android native multimedia APIs based on the Khronos Group OpenMAX AL™ 1.0.1 standard, as of Android API level 14 (Android platform version 4.0) and higher. <p> OpenMAX AL is a companion API to OpenSL ES, but for multimedia (video and audio) rather than audio only. <p> Android 4.0 provides a direct, efficient path for low-level streaming multimedia. The new path is ideal for applications that need to maintain complete control over media data before passing it to the platform for presentation. For example, media applications can now retrieve data from any source, apply proprietary encryption/decryption, and then send the data to the platform for display. <p> Applications can now send processed data to the platform as a multiplexed stream of audio/video content in MPEG-2 transport stream format. The platform de-muxes, decodes, and renders the content. The audio track is rendered to the active audio device, while the video track is rendered to either a Surface or a SurfaceTexture. When rendering to a SurfaceTexture, the application can apply subsequent graphics effects to each frame using OpenGL. <p> OpenMAX AL provides a C language interface that is also callable from C++, and exposes features similar to these Android APIs callable from Java programming language code: <ul> <li><a href="http://developer.android.com/reference/android/media/MediaPlayer.html"> android.media.MediaPlayer</a> </ul> As with all of the Android Native Development Kit (NDK), the primary purpose of OpenMAX AL for Android is to facilitate the implementation of shared libraries to be called from Java programming language code via Java Native Interface (JNI). NDK is not intended for writing pure C/C++ applications. 
<p> Note: though based on OpenMAX AL, the Android native multimedia API is <i>not</i> a conforming implementation of either OpenMAX AL 1.0.1 profile (media player or media player / recorder). This is because Android does not implement all of the features required by either of the profiles. Any known cases where Android behaves differently from the specification are described in section "Android extensions" below. The Android OpenMAX AL implementation has limited features, and is intended primarily for certain performance-sensitive native streaming multimedia applications such as video players. <p> The major feature is the ability to play an MPEG-2 transport stream containing a single program stream made up of one H.264 video elementary stream and one AAC audio elementary stream. The application provides the stream via an Android buffer queue data source, which is based on the OpenSL ES buffer queue concept and Android-specific extensions. <p> The video sink is an <code>ANativeWindow *</code> abstract handle, derived from an <code>android.view.Surface</code> ("surface"). A Surface from <code>SurfaceHolder.getSurface()</code> should be used when displaying an unaltered video within a fixed SurfaceView frame. A Surface from <code>new Surface(SurfaceTexture)</code> allows streaming the decoded video frames to an OpenGL ES 2.0 texture, where the frames can be used as input to a shader algorithm in the Graphics Processing Unit (GPU). Be sure to <code>release()</code> the Surface as soon as possible after calling <code>setSurface</code> or <code>ANativeWindow_fromSurface</code>. <p> The audio sink is always an output mix, a device-independent mixer object similar to that of OpenSL ES. <h2>Getting started</h2> <h3>Example code</h3> <h4>Recommended</h4> Supported and tested example code, usable as a model for your own code, is located in NDK folder <code>platforms/android-14/samples/native-media/</code>.
<h4>Not recommended</h4> The OpenMAX AL 1.0.1 specification contains example code in the appendices (see section "References" below for the link to this specification). However, the examples in Appendix D: Sample Code use features not supported by Android. Some examples also contain typographical errors, or use APIs that are likely to change. Proceed with caution in referring to these; though the code may be helpful in understanding the full OpenMAX AL standard, it should not be used as is with Android. <h3>Adding OpenMAX AL to your application source code</h3> OpenMAX AL is a C API, but is callable from both C and C++ code. <p> Add the following lines to your code: <pre>
#include &lt;OMXAL/OpenMAXAL.h&gt;
#include &lt;OMXAL/OpenMAXAL_Android.h&gt;
</pre> <h3>Makefile</h3> Modify your Android.mk as follows: <pre>
LOCAL_LDLIBS += -lOpenMAXAL
</pre> <h3>Multimedia content</h3> The only supported way to supply multimedia content is via an MPEG-2 transport stream. <p> Finding or creating useful multimedia content for your application is beyond the scope of this article. <p> Note that it is your responsibility to ensure that you are legally permitted to play the content. <h3>Debugging</h3> For robustness, we recommend that you examine the <code>XAresult</code> value returned by most APIs. Use of <code>assert</code> vs. more advanced error handling logic is a matter of coding style and the particular API; see the Wikipedia article on <a href="http://en.wikipedia.org/wiki/Assertion_(computing)">assert</a> for more information. In the supplied example, we have used <code>assert</code> for "impossible" conditions which would indicate a coding error, and explicit error handling for others which are more likely to occur in production. <p> Many API errors result in a log entry, in addition to the non-zero result code. These log entries provide additional detail which can be especially useful for the more complex APIs such as <code>Engine::CreateMediaPlayer</code>.
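The error handling convention above can be sketched as a small helper. This is a minimal, self-contained sketch: the <code>XAresult</code> typedef and error code values below are stand-ins so the fragment compiles outside the NDK; real code takes them from <code>&lt;OMXAL/OpenMAXAL.h&gt;</code>, where the actual values may differ.

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Stand-ins so this sketch compiles outside the NDK; real code gets
 * these from <OMXAL/OpenMAXAL.h>, where the actual values may differ. */
typedef uint32_t XAresult;
#define XA_RESULT_SUCCESS        ((XAresult) 0x00000000)
#define XA_RESULT_RESOURCE_ERROR ((XAresult) 0x00000005)  /* illustrative value */

/* Log a failed call and report whether it succeeded. The caller decides
 * between assert() for "impossible" failures (coding errors) and
 * explicit recovery for failures expected in production. */
static int xa_check(XAresult result, const char *what)
{
    if (result != XA_RESULT_SUCCESS) {
        fprintf(stderr, "%s failed: XAresult %u\n", what, (unsigned) result);
        return 0;
    }
    return 1;
}
```

For a condition that should be impossible, wrap the check as <code>assert(xa_check(result, "Engine::CreateMediaPlayer"));</code> for errors that can occur in production, branch on the return value instead.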
<p> Use <a href="http://developer.android.com/guide/developing/tools/adb.html"> adb logcat</a>, the <a href="http://developer.android.com/guide/developing/eclipse-adt.html"> Eclipse ADT plugin</a> LogCat pane, or <a href="http://developer.android.com/guide/developing/tools/ddms.html#logcat"> ddms logcat</a> to see the log. <h2>Supported features from OpenMAX AL 1.0.1</h2> This section summarizes available features. In some cases, there are limitations which are described in the next sub-section. <h3>Global entry points</h3> Supported global entry points: <ul> <li><code>xaCreateEngine</code> <li><code>xaQueryNumSupportedEngineInterfaces</code> <li><code>xaQuerySupportedEngineInterfaces</code> </ul> <h3>Objects and interfaces</h3> The following figure indicates objects and interfaces supported by Android's OpenMAX AL implementation. A green cell means the feature is supported. <p> <img src="chart3.png" alt="Supported objects and interfaces"> <h3>Limitations</h3> This section details limitations with respect to the supported objects and interfaces from the previous section. <h4>Audio</h4> The audio stream type cannot be configured; it is always <code>AudioManager.STREAM_MUSIC</code>. <p> Effects are not supported. <h4>Dynamic interface management</h4> <code>RemoveInterface</code> and <code>ResumeInterface</code> are not supported. 
<h4>Engine</h4> Supported: <ul> <li><code>CreateMediaPlayer</code> </ul> Not supported: <ul> <li><code>CreateCameraDevice</code> <li><code>CreateRadioDevice</code> <li><code>CreateLEDDevice</code> <li><code>CreateVibraDevice</code> <li><code>CreateMetadataExtractor</code> <li><code>CreateExtensionObject</code> <li><code>GetImplementationInfo</code> </ul> For <code>CreateMediaPlayer</code>, these restrictions apply: <ul> <li>audio sink is an output mix data locator <li>video sink is a native display data locator <li>soundbank, LED array, and vibra sinks must be <code>NULL</code> </ul> <h4>MIME data format</h4> In the current Android implementation of OpenMAX AL, a media player receives its source data as an MPEG-2 transport stream via a buffer queue. <p> The source data locator must be <code>XA_DATALOCATOR_ANDROIDBUFFERQUEUE</code> (see "Android extensions" below). <p> The source data format must be <code>XADataFormat_MIME</code>. Initialize <code>mimeType</code> to <code>XA_ANDROID_MIME_MP2TS</code>, and <code>containerType</code> to <code>XA_CONTAINERTYPE_MPEG_TS</code>. <p> The contained transport stream must have a single program with one H.264 video elementary stream and one AAC audio elementary stream. <h4>Object</h4> <code>Resume</code>, <code>RegisterCallback</code>, <code>AbortAsyncOperation</code>, <code>SetPriority</code>, <code>GetPriority</code>, and <code>SetLossOfControlInterfaces</code> are not supported. <h4>StreamInformation</h4> Use the StreamInformation interface on a media player object to discover when the video metrics (height/width or aspect ratio) are available or changed, and to then get the sizes. 
<p> Supported: <ul> <li><code>RegisterStreamChangeCallback</code> <li><code>QueryMediaContainerInformation</code> <li><code>QueryStreamInformation</code> <li><code>QueryStreamType</code> </ul> Not supported: <ul> <li><code>QueryActiveStreams</code> <li><code>QueryStreamName</code> <li><code>SetActiveStream</code> </ul> <h4>VideoDecoderCapabilities</h4> This interface on the engine object reports video decoder capabilities without interpretation, exactly as claimed by the underlying OpenMAX IL implementation. <p> These fields in <code>XAVideoCodecDescriptor</code> are filled: <ul> <li><code>codecId</code> <li><code>profileSetting</code> <li><code>levelSetting</code> </ul> The other fields are not filled and should be ignored. <p> Applications should rely on the capabilities documented at <a href="http://developer.android.com/guide/appendix/media-formats.html">Android Supported Media Formats</a>, not the information reported by this interface. <h3>Data structures</h3> Android API level 14 supports these OpenMAX AL 1.0.1 data structures: <ul> <li>XADataFormat_MIME <li>XADataLocator_NativeDisplay <li>XADataLocator_OutputMix <li>XADataSink <li>XADataSource <li>XAEngineOption <li>XAInterfaceID <li>XAMediaContainerInformation <li>XANativeHandle <li>XA*StreamInformation <li>XAVideoCodecDescriptor </ul> <h4>XADataLocator_NativeDisplay</h4> The native display data locator is used to specify the video sink: <pre> typedef struct XADataLocator_NativeDisplay_ { XAuint32 locatorType; // XA_DATALOCATOR_NATIVEDISPLAY XANativeHandle hWindow; // ANativeWindow * XANativeHandle hDisplay; // NULL } XADataLocator_NativeDisplay; </pre> Set the <code>hWindow</code> field to an <code>ANativeWindow *</code> and set <code>hDisplay</code> to <code>NULL</code>. 
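As a concrete illustration, the video sink locator can be filled in as below. This is a hedged sketch: the typedefs and the <code>XA_DATALOCATOR_NATIVEDISPLAY</code> value are stand-ins so the fragment compiles on its own; a real application takes them from <code>&lt;OMXAL/OpenMAXAL.h&gt;</code>, and passes a window handle obtained via <code>ANativeWindow_fromSurface</code>.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Stand-in declarations so this compiles outside the NDK; real code uses
 * the definitions in <OMXAL/OpenMAXAL.h> (the constant's actual value
 * may differ). */
typedef uint32_t XAuint32;
typedef void   *XANativeHandle;
#define XA_DATALOCATOR_NATIVEDISPLAY ((XAuint32) 0x00000003)  /* illustrative */

typedef struct XADataLocator_NativeDisplay_ {
    XAuint32       locatorType;  /* XA_DATALOCATOR_NATIVEDISPLAY */
    XANativeHandle hWindow;      /* ANativeWindow * */
    XANativeHandle hDisplay;     /* NULL */
} XADataLocator_NativeDisplay;

/* Prepare the video sink locator from a window handle previously
 * obtained with ANativeWindow_fromSurface(). */
static void init_video_sink(XADataLocator_NativeDisplay *loc, void *nativeWindow)
{
    loc->locatorType = XA_DATALOCATOR_NATIVEDISPLAY;
    loc->hWindow     = nativeWindow;
    loc->hDisplay    = NULL;  /* always NULL on Android */
}
```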
You can get an <code>ANativeWindow *</code> handle from an <code>android.view.Surface</code>, using this NDK function: <pre>
#include &lt;android/native_window_jni.h&gt;
ANativeWindow* ANativeWindow_fromSurface(JNIEnv* env, jobject surface);
</pre> Don't forget to free this handle in your shutdown code with <code>ANativeWindow_release</code>. <h3>Platform configuration</h3> OpenMAX AL for Android is designed for multi-threaded applications, and is thread-safe. <p> OpenMAX AL for Android supports a single engine per application, and up to 32 objects. Available device memory and CPU may further restrict the usable number of objects. <p> <code>xaCreateEngine</code> recognizes, but ignores, these engine options: <ul> <li><code>XA_ENGINEOPTION_THREADSAFE</code> <li><code>XA_ENGINEOPTION_LOSSOFCONTROL</code> </ul> OpenMAX AL and OpenSL ES may be used together in the same application. In this case, there is internally a single shared engine object, and the 32 object limit is shared between OpenMAX AL and OpenSL ES. The application should first create both engines, then use both engines, and finally destroy both engines. The implementation maintains a reference count on the shared engine, so that it is destroyed only on the second destroy call. <h2>Planning for future versions of OpenMAX AL</h2> The Android native multimedia APIs at level 14 are based on Khronos Group OpenMAX AL 1.0.1 (see section "References" below). As of the time of this writing, Khronos has recently released a revised version 1.1 of the standard. The revised version includes new features, clarifications, correction of typographical errors, and some incompatibilities. Most of the incompatibilities are relatively minor, or are in areas of OpenMAX AL not supported by Android. However, even a small change can be significant for an application developer, so it is important to prepare for this. <p> The Android team is committed to preserving future API binary compatibility for developers to the extent feasible.
It is our intention to continue to support future binary compatibility of the 1.0.1-based API, even as we add support for later versions of the standard. An application developed with this version should work on future versions of the Android platform, provided that you follow the guidelines listed in section "Planning for binary compatibility" below. <p> Note that future source compatibility will <i>not</i> be a goal. That is, if you upgrade to a newer version of the NDK, you may need to modify your application source code to conform to the new API. We expect that most such changes will be minor; see details below. <h3>Planning for binary compatibility</h3> We recommend that your application follow these guidelines, to improve future binary compatibility: <ul> <li> Use only the documented subset of Android-supported features from OpenMAX AL 1.0.1. <li> Do not depend on a particular result code for an unsuccessful operation; be prepared to deal with a different result code. <li> Application callback handlers generally run in a restricted context, and should be written to perform their work quickly and then return as soon as possible. Do not do complex operations within a callback handler. For example, within a buffer queue completion callback, you can enqueue another buffer, but do not create a media player. <li> Callback handlers should be prepared to be called more or less frequently, to receive additional event types, and should ignore event types that they do not recognize. Callbacks that are configured with an event mask of enabled event types should be prepared to be called with multiple event type bits set simultaneously. Use "&" to test for each event bit rather than a switch case. <li> Use prefetch status and callbacks as a general indication of progress, but do not depend on specific hard-coded fill levels or callback sequence. The meaning of the prefetch status fill level, and the behavior for errors that are detected during prefetch, may change. 
</ul> <h3>Planning for source compatibility</h3> As mentioned, source code incompatibilities are expected in the next version of OpenMAX AL from Khronos Group. Likely areas of change include: <ul> <li>Addition of <code>const</code> to input parameters passed by reference, and to <code>XAchar *</code> struct fields used as input values. This should not require any changes to your code. <li>Substitution of unsigned types for some parameters that are currently signed. You may need to change a parameter type from <code>XAint32</code> to <code>XAuint32</code> or similar, or add a cast. <li>Additional fields in struct types. For output parameters, these new fields can be ignored, but for input parameters the new fields will need to be initialized. Fortunately, these are expected to all be in areas not supported by Android. <li>Interface <a href="http://en.wikipedia.org/wiki/Globally_unique_identifier"> GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID to avoid a dependency. <li><code>XAchar</code> will change from <code>unsigned char</code> to <code>char</code>. This primarily affects the MIME data format. <li><code>XADataFormat_MIME.mimeType</code> will be renamed to <code>pMimeType</code>. We recommend that you initialize the <code>XADataFormat_MIME</code> data structure using a brace-enclosed comma-separated list of values, rather than by field name, to isolate your code from this change. In the example code we have used this technique. </ul> <h2>Android extensions</h2> The API for Android extensions is defined in <code>OMXAL/OpenMAXAL_Android.h</code>. Consult that file for details on these extensions. Unless otherwise noted, all interfaces are "explicit". <p> Note that use of these extensions will limit your application's portability to other OpenMAX AL implementations. If this is a concern, we advise that you avoid using them, or isolate your use of them with <code>#ifdef</code> etc.
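The brace-enclosed initialization recommended above, applied to the MPEG-2 transport stream source format described earlier, can be sketched as follows. The struct layout mirrors the 1.0.1 specification, but the typedefs and constant values here are stand-ins so the fragment compiles standalone; real code takes them from <code>&lt;OMXAL/OpenMAXAL.h&gt;</code> and <code>&lt;OMXAL/OpenMAXAL_Android.h&gt;</code>.

```c
#include <assert.h>
#include <stdint.h>

/* Stand-ins so this compiles outside the NDK; the real definitions and
 * constant values live in <OMXAL/OpenMAXAL.h> and
 * <OMXAL/OpenMAXAL_Android.h>, and may differ from what is shown. */
typedef uint32_t      XAuint32;
typedef unsigned char XAchar;
#define XA_DATAFORMAT_MIME       ((XAuint32) 0x00000001)      /* illustrative */
#define XA_CONTAINERTYPE_MPEG_TS ((XAuint32) 0x00000024)      /* illustrative */
#define XA_ANDROID_MIME_MP2TS    ((XAchar *) "video/mp2ts")   /* assumed string */

typedef struct XADataFormat_MIME_ {
    XAuint32  formatType;
    XAchar   *mimeType;      /* expected to be renamed pMimeType in 1.1 */
    XAuint32  containerType;
} XADataFormat_MIME;

/* Positional (brace-enclosed) initialization, so the planned
 * mimeType -> pMimeType rename will not require a source change here. */
static XADataFormat_MIME format_mp2ts = {
    XA_DATAFORMAT_MIME,
    XA_ANDROID_MIME_MP2TS,
    XA_CONTAINERTYPE_MPEG_TS
};
```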
<p> The following figure shows which Android-specific interfaces and data locators are available for each object type. <p> <img src="chart4.png" alt="Android extensions"> <h3>Android buffer queue data locator and interface</h3> <h4>Comparison with OpenSL ES buffer queue</h4> The Android buffer queue data locator and interface are based on similar concepts from OpenSL ES 1.0.1, with these differences: <ul> <li>Though currently usable with only a media player and MPEG-2 transport stream data, the Android buffer queue API is designed for flexibility so that the API can also apply to other use cases in the future. <li>Commands may be optionally specified by the application at time of <code>Enqueue</code>. Each command consists of an item key and optional item value. Command key/value pairs are carried alongside the corresponding buffer in the queue, and thus are processed in synchrony with the buffer. <li>To enqueue command(s) without associated data, specify a buffer address of NULL and buffer size of zero, along with at least one command. <li>Status may be provided by the implementation during a completion callback. Each status consists of an item key and optional item value. Status key/value pairs are carried alongside the corresponding buffer in the queue, and thus are received by the application in synchrony with the completion callback. <li>The completion callback receives additional parameters: buffer address, buffer maximum data size, buffer actual size consumed (or filled, for a future recorder object), and a <code>void *</code> context for application use. <li>The callback returns a value, which must be <code>XA_RESULT_SUCCESS</code>. </ul> The data locator type code is <code>XA_DATALOCATOR_ANDROIDBUFFERQUEUE</code> and the associated structure is <code>XADataLocator_AndroidBufferQueue</code>. <p> The interface ID is <code>XA_IID_ANDROIDBUFFERQUEUESOURCE</code>. <h4>Usage</h4> A typical buffer queue configuration is 8 buffers of 1880 bytes each.
<p> The application enqueues filled buffers of data in MPEG-2 transport stream format. The buffer size must be a multiple of 188 bytes, the size of an MPEG-2 transport stream packet. The buffer data must be properly aligned on a packet boundary, and formatted per the MPEG-2 Part 1 specification. <p> An application may supply zero or one of these item codes (command key/value pairs) at <code>Enqueue</code>: <dl> <dt>XA_ANDROID_ITEMKEY_EOS</dt> <dd>End of stream. Informs the decode and rendering components that playback is complete. The application must not call <code>Enqueue</code> again. There is no associated value, so <code>itemSize</code> must be zero. There must be no data buffer alongside the EOS command. </dd> <dt>XA_ANDROID_ITEMKEY_DISCONTINUITY</dt> <dd>Discontinuity. This and following buffers have a new presentation time. The new presentation time may be optionally specified as a parameter, expressed in <code>itemData</code> as a 64-bit unsigned integer in units of 90 kHz clock ticks. The <code>itemSize</code> should be either zero or 8. The discontinuity command is intended for seeking to a new point in the stream. The application should flush its internal data, then send the discontinuity command prior to, or alongside of, the first buffer corresponding to the new stream position. The initial packets in the video elementary stream should describe an IDR (Instantaneous Decoding Refresh) frame. Note that the discontinuity command is not intended for stream configuration / format changes; for these use <code>XA_ANDROID_ITEMKEY_FORMAT_CHANGE</code>. </dd> <dt>XA_ANDROID_ITEMKEY_FORMAT_CHANGE</dt> <dd>Format change. This and following buffers have a new format, for example for MBR (Multiple Bit Rate) or resolution switching. </dd> </dl> <p> Upon notification of completion via a registered callback, the now consumed buffer is available for the application to re-fill. 
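Two of the rules above lend themselves to small helpers: buffer sizes must be whole multiples of the 188-byte transport stream packet, and the optional discontinuity <code>itemData</code> is a presentation time in 90 kHz clock ticks. A self-contained sketch (the helper names are ours, not part of the API):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

#define TS_PACKET_SIZE 188  /* bytes per MPEG-2 transport stream packet */

/* A buffer passed to Enqueue must contain a whole number of packets. */
static int ts_buffer_size_ok(size_t size)
{
    return size > 0 && size % TS_PACKET_SIZE == 0;
}

/* XA_ANDROID_ITEMKEY_DISCONTINUITY optionally carries a new presentation
 * time as a 64-bit unsigned integer in 90 kHz clock ticks (itemSize 8). */
static uint64_t ms_to_90khz_ticks(uint64_t milliseconds)
{
    return milliseconds * 90;  /* 90,000 ticks per second */
}
```

For example, the typical 1880-byte buffer mentioned above holds exactly 10 transport stream packets.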
<p> The implementation may supply zero or more of these item codes (status key/value pairs) to the callback handler: <dl> <dt>XA_ANDROID_ITEMKEY_BUFFERQUEUEEVENT</dt> <dd>Buffer queue event mask. The <code>itemSize</code> is 4, and <code>itemData</code> contains the bit-wise "or" of zero or more of the <code>XA_ANDROIDBUFFERQUEUEEVENT_*</code> symbols. This event mask explains the reason(s) why the callback handler was called.</dd> </dl> <h3>Reporting of extensions</h3> <code>Engine::QueryNumSupportedExtensions</code>, <code>Engine::QuerySupportedExtension</code>, <code>Engine::IsExtensionSupported</code> report these extensions: <ul> <li><code>ANDROID_SDK_LEVEL_#</code> where # is the platform API level, 14 or higher </ul> <h2>Programming notes</h2> These notes supplement the OpenMAX AL 1.0.1 specification, available in the "References" section below. <h3>Objects and interface initialization</h3> Two aspects of the OpenMAX AL programming model that may be unfamiliar to new developers are the distinction between objects and interfaces, and the initialization sequence. <p> Briefly, an OpenMAX AL object is similar to the object concept in programming languages such as Java and C++, except an OpenMAX AL object is <i>only</i> visible via its associated interfaces. This includes the initial interface for all objects, called <code>XAObjectItf</code>. There is no handle for an object itself, only a handle to the <code>XAObjectItf</code> interface of the object. <p> An OpenMAX AL object is first "created", which returns an <code>XAObjectItf</code>, then "realized". This is similar to the common programming pattern of first constructing an object (which should never fail other than for lack of memory or invalid parameters), and then completing initialization (which may fail due to lack of resources). The realize step gives the implementation a logical place to allocate additional resources if needed. 
<p> As part of the API to create an object, an application specifies an array of desired interfaces that it plans to acquire later. Note that this array does <i>not</i> automatically acquire the interfaces; it merely indicates a future intention to acquire them. Interfaces are distinguished as "implicit" or "explicit". An explicit interface <i>must</i> be listed in the array if it will be acquired later. An implicit interface need not be listed in the object create array, but there is no harm in listing it there. OpenMAX AL has one more kind of interface called "dynamic", which does not need to be specified in the object create array, and can be added after the object is created. The Android implementation provides a convenience feature to avoid this complexity; see section "Dynamic interfaces at object creation" above. <p> After the object is created and realized, the application should acquire interfaces for each feature it needs, using <code>GetInterface</code> on the initial <code>XAObjectItf</code>. <p> Finally, the object is available for use via its interfaces, though note that some objects require further setup. Some use cases need a bit more preparation in order to detect connection errors. See the next section "Media player prefetch" for details. <p> After your application is done with the object, you should explicitly destroy it; see section "Destroy" below. <h3>Media player prefetch</h3> For a media player, <code>Object::Realize</code> allocates resources but does not connect to the data source (i.e. "prepare") or begin pre-fetching data. These occur once the player state is set to either <code>XA_PLAYSTATE_PAUSED</code> or <code>XA_PLAYSTATE_PLAYING</code>. <p> The prefetch status interface is useful for detecting errors. Register a callback and enable at least the <code>XA_PREFETCHEVENT_FILLLEVELCHANGE</code> and <code>XA_PREFETCHEVENT_STATUSCHANGE</code> events.
If both of these events are delivered simultaneously, and <code>PrefetchStatus::GetFillLevel</code> reports a zero level, and <code>PrefetchStatus::GetPrefetchStatus</code> reports <code>XA_PREFETCHSTATUS_UNDERFLOW</code>, then this indicates a non-recoverable error in the data source or in rendering to the video sink. <p> The next version of OpenMAX AL is expected to add more explicit support for handling errors in the data source. However, for future binary compatibility, we intend to continue to support the current method for reporting a non-recoverable error. <p> In summary, a recommended code sequence is: <ul> <li><code>Engine::CreateMediaPlayer</code> <li><code>Object::Realize</code> <li><code>Object::GetInterface</code> for <code>XA_IID_PREFETCHSTATUS</code> <li><code>PrefetchStatus::SetCallbackEventsMask</code> <li><code>PrefetchStatus::RegisterCallback</code> <li><code>Object::GetInterface</code> for <code>XA_IID_PLAY</code> <li><code>Play::SetPlayState</code> to <code>XA_PLAYSTATE_PAUSED</code> or <code>XA_PLAYSTATE_PLAYING</code> <li>preparation and prefetching occur here; during this time your callback will be called with periodic status updates and error notifications </ul> <h3>Destroy</h3> Be sure to destroy all objects on exit from your application. Objects should be destroyed in reverse order of their creation, as it is not safe to destroy an object that has any dependent objects. For example, destroy in this order: audio players and recorders, output mix, then finally the engine. <p> OpenMAX AL does not support automatic garbage collection or <a href="http://en.wikipedia.org/wiki/Reference_counting">reference counting</a> of interfaces. After you call <code>Object::Destroy</code>, all extant interfaces derived from the associated object become <i>undefined</i>. <p> The Android OpenMAX AL implementation does not detect the incorrect use of such interfaces. Continuing to use such interfaces after the object is destroyed will cause your application to crash or behave in unpredictable ways.
<p> We recommend that you explicitly set both the primary object interface and all associated interfaces to NULL as part of your object destruction sequence, to prevent the accidental misuse of a stale interface handle. <h3>Callbacks and threads</h3> Callback handlers are generally called <i>synchronously</i> with respect to the event, that is, at the moment and location where the event is detected by the implementation. But this point is <i>asynchronous</i> with respect to the application. Thus you should use a mutex or other synchronization mechanism to control access to any variables shared between the application and the callback handler. In the example code, such as for buffer queues, we have omitted this synchronization in the interest of simplicity. However, proper mutual exclusion would be critical for any production code. <p> Callback handlers are called from internal non-application thread(s) which are not attached to the Dalvik virtual machine and thus are ineligible to use JNI. Because these internal threads are critical to the integrity of the OpenMAX AL implementation, a callback handler should also not block or perform excessive work. Therefore, if your callback handler needs to use JNI or do anything significant (e.g. beyond an <code>Enqueue</code> or something else simple such as the "Get" family), the handler should instead post an event for another thread to process. <p> Note that the converse is safe: a Dalvik application thread which has entered JNI is allowed to directly call OpenMAX AL APIs, including those which block. However, blocking calls are not recommended from the main thread, as they may result in the dreaded "Application Not Responding" (ANR). <h3>Performance</h3> As OpenMAX AL is a native C API, non-Dalvik application threads which call OpenMAX AL have no Dalvik-related overhead such as garbage collection pauses. However, there is no additional performance benefit to the use of OpenMAX AL other than this. 
In particular, use of OpenMAX AL does not result in lower audio or video latency, higher scheduling priority, etc. than what the platform generally provides. On the other hand, as the Android platform and specific device implementations continue to evolve, an OpenMAX AL application can expect to benefit from any future system performance improvements. <h3>Security and permissions</h3> As far as who can do what, security in Android is done at the process level. Java programming language code can't do anything more than native code, nor can native code do anything more than Java programming language code. The only differences between them are the APIs that are available, which provide functionality that the platform promises to support in future versions and across different devices. <p> Applications using OpenMAX AL must request whatever permissions they would need for similar non-native APIs. Applications that play network resources need <code>android.permission.INTERNET</code>. Note that the current Android implementation does not directly access the network, but many applications that play multimedia receive their data via the network. <h2>Platform issues</h2> This section describes known issues in the initial platform release which supports these APIs. <h3>Dynamic interface management</h3> <code>DynamicInterfaceManagement::AddInterface</code> does not work. Instead, specify the interface in the array passed to Create.
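The workaround of listing every needed interface at creation time can be sketched as follows. Because the real interface GUIDs and the engine interface come from the NDK headers, the identifiers below are clearly-labeled stand-ins, and the actual <code>CreateMediaPlayer</code> call is shown only as a comment.

```c
#include <assert.h>
#include <stdint.h>

/* Stand-ins so this compiles outside the NDK; real code uses the
 * XAInterfaceID constants (e.g. XA_IID_PLAY,
 * XA_IID_ANDROIDBUFFERQUEUESOURCE) from the OpenMAX AL headers. */
typedef const void *XAInterfaceID;
typedef uint32_t    XAboolean;
#define XA_BOOLEAN_TRUE ((XAboolean) 1)
static const int xa_iid_play = 0;  /* placeholder for XA_IID_PLAY */
static const int xa_iid_abq  = 1;  /* placeholder for XA_IID_ANDROIDBUFFERQUEUESOURCE */

/* Because DynamicInterfaceManagement::AddInterface does not work, every
 * interface the player will need must be listed at creation time: */
static const XAInterfaceID ids[2] = { &xa_iid_play, &xa_iid_abq };
static const XAboolean     req[2] = { XA_BOOLEAN_TRUE, XA_BOOLEAN_TRUE };

/* In real code the arrays are then passed to Engine::CreateMediaPlayer:
 *
 *   res = (*engineItf)->CreateMediaPlayer(engineItf, &playerObj,
 *           &dataSource, NULL, &audioSink, &imageVideoSink, NULL, NULL,
 *           2, ids, req);
 */
```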
<h2>References and resources</h2> Android: <ul> <li><a href="http://developer.android.com/resources/index.html"> Android developer resources</a> <li><a href="http://groups.google.com/group/android-developers"> Android developers discussion group</a> <li><a href="http://developer.android.com/sdk/ndk/index.html">Android NDK</a> <li><a href="http://groups.google.com/group/android-ndk"> Android NDK discussion group</a> (for developers of native code, including OpenMAX AL) <li><a href="http://code.google.com/p/android/issues/"> Android open source bug database</a> </ul> Khronos Group: <ul> <li><a href="http://www.khronos.org/openmax/al/"> Khronos Group OpenMAX AL Overview</a> <li><a href="http://www.khronos.org/registry/omxal/"> Khronos Group OpenMAX AL 1.0.1 specification</a> <li><a href="http://www.khronos.org/message_boards/viewforum.php?f=30"> Khronos Group public message board for OpenMAX AL</a> (please limit to non-Android questions) </ul> For convenience, we have included a copy of the OpenMAX AL 1.0.1 specification with the NDK in <code>docs/openmaxal/OpenMAX_AL_1_0_1_Specification.pdf</code>. <p> Miscellaneous: <ul> <li><a href="http://en.wikipedia.org/wiki/Java_Native_Interface">JNI</a> <li><a href="http://en.wikipedia.org/wiki/MPEG-2">MPEG-2</a> <li><a href="http://en.wikipedia.org/wiki/MPEG_transport_stream">MPEG-2 transport stream</a> <li><a href="http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC">H.264</a> <li><a href="http://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a> </ul> </body> </html>