We'll start by writing the MIDlet and a Form that will display the system properties related to MMA capabilities, as shown in Listing 1. Use your favorite editor (Notepad will also suffice) to edit the code. Remember to save the files under a directory structure that matches the package name (com/srijeeb/jme).
Listing 1. Our main MIDlet class: MobileVideoApp
package com.srijeeb.jme;
import javax.microedition.lcdui.*;
import javax.microedition.media.*;
import javax.microedition.midlet.MIDlet;
public class MobileVideoApp extends MIDlet {

    private Display display;
    private PropertyForm form;

    public MobileVideoApp() {
        form = new PropertyForm("Mobile Video App", this);
    }

    public void startApp() {
        display = Display.getDisplay(this);
        display.setCurrent(form);
    }

    public void pauseApp() {
    }

    public void destroyApp(boolean unconditional) {
    }

    public Display getDisplay() {
        return display;
    }
}
I won't be explaining methods such as startApp(), pauseApp(), or destroyApp(), because these are generic to any MIDlet application, and they require no special attention for writing our example mobile media application. PropertyForm is a class that we have extended from the javax.microedition.lcdui.Form class. Listing 2 shows PropertyForm's important sections.
Listing 2. PropertyForm code for displaying MMA-specific properties
package com.srijeeb.jme;
import javax.microedition.lcdui.*;
import javax.microedition.media.*;
public class PropertyForm extends Form implements CommandListener {

    private final static Command CMD_EXIT =
        new Command("Exit", Command.EXIT, 1);

    private MobileVideoApp parentMidlet = null;

    protected PropertyForm(String in, MobileVideoApp parentMidlet_) {
        super(in);
        this.parentMidlet = parentMidlet_;
        initComponents();
    }

    // Builds the form: a banner, then one StringItem per MMA-related property,
    // with separator images in between.
    public void initComponents() {
        append(JMEUtility.getImage("/images/banner.png"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("version", "microedition.media.version"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Audio Capture", "supports.audio.capture"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Video Capture", "supports.video.capture"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Recording", "supports.recording"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Audio Enc", "audio.encodings"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Video Enc", "video.encodings"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Video Snp Enc", "video.snapshot.encodings"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getStringItem("Stream Cont", "streamable.contents"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(JMEUtility.getImage("/images/separator.png"));
        append(getSupportedProtocols());
        append(JMEUtility.getImage("/images/separator.png"));
        append(getSupportedContentTypeForHttp());
        addCommand(CMD_EXIT);
        setCommandListener(this);
    }

    // Wraps a System.getProperty() lookup in a labeled StringItem.
    private StringItem getStringItem(String name, String propertyName) {
        String value = System.getProperty(propertyName);
        return new StringItem("[" + name + "]", value);
    }

    private StringItem getSupportedProtocols() {
        return new StringItem("[Protocols]",
            concatArray(Manager.getSupportedProtocols(null)));
    }

    private StringItem getSupportedContentTypeForHttp() {
        return new StringItem("[Content http]",
            concatArray(Manager.getSupportedContentTypes("http")));
    }

    public void commandAction(Command c, Displayable d) {
        if (c == CMD_EXIT) {
            parentMidlet.destroyApp(true);
            parentMidlet.notifyDestroyed();
        }
    }

    // Joins the entries of a string array into a single "|"-separated string.
    public String concatArray(String[] list) {
        String ret = "";
        if (list != null && list.length > 0) {
            for (int i = 0; i < list.length; i++) {
                ret += list[i];
                if (i < (list.length - 1)) {
                    ret += "|";
                }
            }
        }
        return ret;
    }
}
You will have noticed the use of the JMEUtility class in Listing 2. This is a small class that I have written to perform some utility tasks -- loading images, showing error messages, and so on. It's not that important in the context of our article; you can find out more by examining JMEUtility.java, which is part of the source code supplied with this article. For the time being, assume that once we call JMEUtility.getImage(String), it loads the image mentioned in the parameter and caches it. The next time the same method call occurs, it returns the same image from the cache. (The reasons for caching the image have to do with design patterns and best practices for mobile application development, and are outside the scope of this article.)
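If you are curious what such a helper could look like, the following is a minimal, hypothetical sketch of a caching getImage() method. It is not the class shipped with the article's source, which also handles error messages and other utility tasks:

package com.srijeeb.jme;

import java.util.Hashtable;
import javax.microedition.lcdui.Image;

// Hypothetical sketch of a caching image loader; the real JMEUtility
// supplied with this article's source code may differ in its details.
public class JMEUtility {

    // Images already loaded, keyed by their resource path.
    private static final Hashtable imageCache = new Hashtable();

    public static Image getImage(String path) {
        Image img = (Image) imageCache.get(path);
        if (img == null) {
            try {
                // Load the image from the MIDlet JAR and cache it.
                img = Image.createImage(path);
                imageCache.put(path, img);
            } catch (java.io.IOException ioe) {
                // The real utility would report an error here.
            }
        }
        return img;
    }
}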
Our main intention in Listing 2 is to retrieve the MMA-related properties from the device and display them in a form. Effectively, if we call System.getProperty(propertyName), the mobile device should return the value for the particular property passed in the parameter. For example, if we call System.getProperty("supports.recording"), it will return a value of true for the Nokia 3230.
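The same technique works as a runtime guard before you try to use a capability. The snippet below is a small sketch and is not part of the article's source:

    // Sketch: guard a feature with an MMAPI system property check.
    String videoCapture = System.getProperty("supports.video.capture");
    if ("true".equals(videoCapture)) {
        // Safe to attempt video capture on this device.
    } else {
        // The property is "false" or null; disable the capture feature.
    }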
The important property names related to MMAPI are as follows:
microedition.media.version: The version of the MMAPI specification that is implemented.
supports.audio.capture: Is audio capture supported? The string returned is either true or false.
supports.video.capture: Is video capture supported? The string returned is either true or false.
supports.recording: Is recording supported? The string returned is either true or false.
audio.encodings: The string returned specifies the supported audio-capture formats.
video.encodings: The string returned specifies the supported video-capture formats.
video.snapshot.encodings: The string returned specifies the video snapshot formats for the getSnapshot() method in VideoControl.
streamable.contents: The string returned specifies the supported streamable content types.
You may encounter some confusion when trying to reconcile some of the property values mentioned above with what you observe on a real device. For example, the Nokia 3230 and the Nokia 6600 both return true for both the supports.video.capture and the supports.recording properties, so it seems that both devices will support video recording. But there is a catch: if supports.recording returns true, you can record media using at least one player type -- at least one, but not necessarily all. The Nokia 6600 supports the recording of audio but not the recording of video, whereas the Nokia 3230 supports video recording as well.
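Because supports.recording only guarantees that some player type can record, a more dependable check is to ask a concrete player whether it exposes a RecordControl. The method below is a sketch that assumes a capture://video locator is available; it is not part of the article's source:

    // Requires javax.microedition.media.Manager, javax.microedition.media.Player,
    // and javax.microedition.media.control.RecordControl.
    // Sketch: returns true only if a video capture player actually exposes a RecordControl.
    public static boolean canRecordVideo() {
        Player p = null;
        try {
            p = Manager.createPlayer("capture://video");
            p.realize();
            // Standard controls can be requested by their unqualified name.
            RecordControl rc = (RecordControl) p.getControl("RecordControl");
            return rc != null;
        } catch (Exception e) {
            // capture://video is not supported at all on this device.
            return false;
        } finally {
            if (p != null) {
                p.close();
            }
        }
    }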
You might also have noticed two other method calls in Listing 2: getSupportedProtocols() and getSupportedContentTypeForHttp(). Let's take a look at these in more detail.
Inside the getSupportedProtocols() method, we have queried the Manager class (javax.microedition.media.Manager) to return the protocols it can handle to retrieve media. A Manager.getSupportedProtocols(null) call will return all the protocols supported by the Manager. The actual method signature looks like this:

public static java.lang.String[] getSupportedProtocols(java.lang.String content_type)

Here, if the given content type is video/mpeg, then the supported protocols that can be used to play back MPEG video will be returned. If null is passed in as the content type, all the supported protocols for this implementation will be returned.
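As a quick illustration, the two flavors of the call might be used like this (a sketch, not part of Listing 2):

    // Protocols that can be used to play back MPEG video on this device (may be empty).
    String[] mpegProtocols = Manager.getSupportedProtocols("video/mpeg");

    // Passing null returns every protocol the implementation supports,
    // which is exactly what Listing 2 does.
    String[] allProtocols = Manager.getSupportedProtocols(null);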
Inside the getSupportedContentTypeForHttp() method, we have queried the Manager class to return the supported content types it can handle for the HTTP protocol. The Manager.getSupportedContentTypes("http") call will return all the supported content types that can be delivered over the HTTP protocol to a player created by this Manager. The actual method signature looks like this:

public static java.lang.String[] getSupportedContentTypes(java.lang.String protocol)

For example, if the given protocol is http, then the supported content types that can be played back over the HTTP protocol will be returned.
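The complementary calls look like this (again a sketch, not part of Listing 2):

    // MIME content types that can be played back when the media arrives over HTTP.
    String[] httpContentTypes = Manager.getSupportedContentTypes("http");

    // Passing null returns every content type supported by the implementation.
    String[] allContentTypes = Manager.getSupportedContentTypes(null);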
Now it's time to compile and test our code using the Wireless Toolkit. But before doing so, if you are creating and coding the example from scratch (that is, if you have not downloaded the source supplied with this article), then you need to copy two images (banner.png and separator.png) from the supplied source and put them into an images folder under your Wireless Toolkit project's resource (res) directory, so that the /images/banner.png and /images/separator.png paths used in Listing 2 resolve correctly.