• Access to devices & media files
– Sun JMF web site
– Sample chapter from the book "Essential JMF - Java Media Framework", Gordon & Talley, 1998
• http://www.javaolympus.com/J2SE/MEDIA/JMF/JMF.jsp
– Article "Image Capture From Webcams using the Java Media Framework API"
• http://java.sun.com/dev/evangcentral/totallytech/jmf.html (stale)
– Terrazas et al. (2002). Java Media APIs. SAMS Publishing.
• General contents of JMF
• Digital video primer
• Classes
• Abstract Window Toolkit classes
• Pixel representation
• Classes
– javax.media.Format
– javax.media.MediaLocator
– javax.media.protocol.DataSource
– javax.media.CaptureDeviceManager
Digital Video Primer
• Cameras return "frames" of visual information
• Each frame is a 2D matrix
– Components of these arrays are pixels
• Usually integers
– Dimensions: height by width
• E.g., 160 x 120 pixels
• Frames are returned from the camera at a certain rate
– I.e., the frame rate
– E.g., 15 frames per second
• Frames are often represented in programs as one-dimensional arrays
int[] frame; // in Java

Frames as 1D Arrays

int[] frame = null;
// code to get frame from camera…
// get some pixel
// 0 <= x <= width-1, 0 <= y <= height-1
// (0, 0) -> upper left of image
// assumes a row-oriented representation
int x = 10;
int y = 20;
int pixel = frame[y * width + x];
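The indexing scheme above can be exercised with plain Java arrays and no camera at all. The class name, frame size, and pixel value below are made up for this demo; only the y * width + x formula comes from the slide.

```java
// Illustrative sketch (not from the lecture code): a "frame" stored as a
// one-dimensional int array, indexed row by row with y * width + x.
public class FrameDemo {
    static final int WIDTH = 160;
    static final int HEIGHT = 120;

    // 1D index of pixel (x, y) in a row-oriented frame.
    static int index(int x, int y) {
        return y * WIDTH + x;
    }

    public static void main(String[] args) {
        int[] frame = new int[WIDTH * HEIGHT];
        frame[index(10, 20)] = 0xFF0000;          // store a "red" pixel
        System.out.println(frame[index(10, 20)]); // prints 16711680
    }
}
```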
Class java.awt.image.BufferedImage
• An Image comprised of a data buffer for the image, plus color model information
• Represents color or grayscale images in various formats
– E.g., TYPE_INT_RGB, TYPE_USHORT_565_RGB, TYPE_BYTE_GRAY, TYPE_USHORT_GRAY
• Methods include
public ColorModel getColorModel()
• ColorModel (java.awt.image.ColorModel) describes the format of the image
public BufferedImage getSubimage(int x, int y, int w, int h)
public int getRGB(parameter list…)
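As a quick sanity check of these methods, the following runnable sketch (not from the slides; the image size and pixel value are arbitrary) writes one pixel into a TYPE_INT_RGB image, reads it back with getRGB, and takes a sub-image.

```java
import java.awt.image.BufferedImage;

public class BufferedImageDemo {
    // Write a pure-red pixel at (10, 20) and read it back; getRGB returns a
    // packed ARGB int, so the alpha byte is masked off before returning.
    static int roundTrip() {
        BufferedImage img = new BufferedImage(160, 120, BufferedImage.TYPE_INT_RGB);
        img.setRGB(10, 20, 0xFF0000);
        return img.getRGB(10, 20) & 0x00FFFFFF;
    }

    public static void main(String[] args) {
        System.out.println(Integer.toHexString(roundTrip()));       // prints ff0000

        // getSubimage returns a view covering part of the original image.
        BufferedImage img = new BufferedImage(160, 120, BufferedImage.TYPE_INT_RGB);
        BufferedImage sub = img.getSubimage(0, 0, 80, 60);
        System.out.println(sub.getWidth() + "x" + sub.getHeight()); // prints 80x60
    }
}
```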
Class java.awt.image.ColorModel
• Abstract class used to construct representations of colors (color models) for images
• Methods include
– public abstract int getRed(int pixel)
– public abstract int getGreen(int pixel)
– public abstract int getBlue(int pixel)
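A concrete ColorModel can be tried without any JMF machinery. The sketch below (illustrative, not from the slides) uses the JDK's default packed-ARGB model to pull the component values out of a pixel.

```java
import java.awt.image.ColorModel;

public class ColorModelDemo {
    // Extract 8-bit R, G, B components from a packed ARGB pixel using the
    // JDK's default RGB color model (a 32-bit DirectColorModel).
    static int[] components(int argbPixel) {
        ColorModel cm = ColorModel.getRGBdefault();
        return new int[] { cm.getRed(argbPixel), cm.getGreen(argbPixel), cm.getBlue(argbPixel) };
    }

    public static void main(String[] args) {
        int[] rgb = components(0xFFAABBCC);
        System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]); // prints 170 187 204
    }
}
```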
Class javax.media.format.RGBFormat
• The RGBFormat class of the javax.media.format package
– Subclass of VideoFormat (javax.media.format.VideoFormat)
• "32-bit packed RGB stored as 32-bit integers would have the following masks: redMask = 0x00FF0000, greenMask = 0x0000FF00, blueMask = 0x000000FF." (From JMF API)
Pixel Representation
• Pixels as returned from the BufferedImage class getRGB method
– Represented using the lower 24 bits of ints
• High-order byte: Red
• Middle byte: Green
• Low-order byte: Blue
• This format is used because the Java programs we are using set up the image formats this way
– RGBFormat (javax.media.format.RGBFormat) image formats were obtained from the camera
Example
• 1) Given a pixel in RGBFormat as described, use bit-manipulation operations to extract the R, G, and B components
• 2) Use bit-manipulation operations to create a new RGBFormat pixel given 8-bit R, G, and B components
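One way the two exercises might be solved (a sketch, not the official lab solution) is with shifts and masks against the 0x00RRGGBB layout described above:

```java
public class RgbBits {
    // 1) Extract 8-bit components from a packed 0x00RRGGBB pixel.
    static int red(int pixel)   { return (pixel >> 16) & 0xFF; }
    static int green(int pixel) { return (pixel >> 8) & 0xFF; }
    static int blue(int pixel)  { return pixel & 0xFF; }

    // 2) Pack 8-bit R, G, B components into an RGBFormat-style pixel.
    static int pack(int r, int g, int b) {
        return ((r & 0xFF) << 16) | ((g & 0xFF) << 8) | (b & 0xFF);
    }

    public static void main(String[] args) {
        int pixel = pack(0x12, 0x34, 0x56);
        System.out.println(Integer.toHexString(pixel)); // prints 123456
        System.out.println(red(pixel));                 // prints 18
    }
}
```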
Outline of Using JMF to Access USB Camera Devices
• Determine available camera devices
– CaptureDeviceManager.getDeviceList
• Obtain a DataSource
– Use Manager to create this DataSource, based on a MediaLocator
• Obtain a Processor
– From Manager, based on the DataSource
• Obtain a PushBufferDataSource
– From the Processor
Class javax.media.CaptureDeviceManager
• Methods
public static CaptureDeviceInfo getDevice(java.lang.String deviceName)
• Get a device by name
public static java.util.Vector getDeviceList(Format format)
• Returns a Vector containing a list of CaptureDeviceInfo descriptions of devices available on the computer for the format given
– Formats returned may be more than that specified
– E.g., if a given device supports both YUV and RGB formats, then all formats (including both YUV and RGB) will be returned in the result of the getDeviceList call
• If format is null, returns the list of all available devices on the system
Class javax.media.CaptureDeviceInfo
• Methods include
public Format[] getFormats()
• Returns an array of Format objects
• The specific Formats available for the video device
public MediaLocator getLocator()
Class javax.media.Format
• Format class of the javax.media package
• Constructor
Format(java.lang.String encoding)
• Device/media-specific description of a media format
– E.g., for a video (visual) device:
"RGB, 160x120, Length=38400, 16-bit, Masks=31744:992:31, PixelStride=2, LineStride=320, Flipped"
– E.g., for an audio device:
"PCM, 44.1 KHz, 16-bit, Stereo, Signed"
Class javax.media.MediaLocator
• Constructor
– MediaLocator(java.lang.String locatorString)
• Locator strings are similar to URLs
• Example locatorString
– "vfw://0"
• This is for my USB web camera
• MediaLocators can describe files, e.g.,
MediaLocator m = new MediaLocator("file://media/example.mov");
Class javax.media.Manager - 1
• Access point for obtaining system-dependent resources
– Players, Processors, DataSources, DataSinks, TimeBases
• DataSource
– Object used to deliver time-based multimedia data that is specific to a delivery protocol
• Examples of protocols: http, ftp, file
– Once you have a "usable" (formatted) DataSource, you can display the media (via a Player) or manipulate the information (via a Processor)
Class javax.media.Manager - 2
• Manager methods include
public static DataSource createDataSource(MediaLocator sourceLocator)
• Returns a data source for the protocol specified by the MediaLocator. When you call createDataSource on the media locator of a video capture device (obtained from the CaptureDeviceManager), the returned DataSource will implement the CaptureDevice interface
public static DataSource createCloneableDataSource(DataSource source)
• Creates a DataSource that can be cloned; this enables a DataSource to be processed by different tasks
public static Processor createProcessor(DataSource source)
• Processors are used in controlling the processing of media
Class javax.media.protocol.DataSource
• An abstraction for media protocol handlers. DataSource manages the life cycle of the media source by providing a simple connection protocol.
• Methods include
– DataSource(MediaLocator source)
– connect
– start, stop
• Start and stop data transfer
Interface CaptureDevice
• A capture device is a DataSource of type PullDataSource, PullBufferDataSource, PushDataSource, or PushBufferDataSource. It also implements the CaptureDevice interface… A CaptureDevice DataSource contains an array of SourceStreams. These SourceStreams provide the interface for the captured data streams to be read. (From JMF API)
• Methods include
– public FormatControl[] getFormatControls()
Interface FormatControl
• Objects implementing the FormatControl interface can be used to set the Format of the CaptureDevice to the Format desired
• Methods
– public Format setFormat(Format format)
• Returns null if the format is not supported; otherwise, it (typically) returns the format that's actually set
Interface javax.media.Processor
• Processes and controls time-based media data. Extends the Player interface. Unlike a Player, which processes data as a "black box" and only renders data to preset destinations, a Processor supports a programmatic interface enabling control over media-data processing and access to output data streams.
• Processing performed by a Processor is split into three stages:
– Demultiplexing - into separate tracks of data
– Data transcoding - of each track into other formats
– Multiplexing - to form an interleaved stream
• Data transcoding and multiplexing processes are programmable.
Interface javax.media.Processor
• Processor is a Player is a Controller
• Methods include
DataSource getDataOutput()
• Get the output DataSource from the Processor
public void addControllerListener(ControllerListener listener)
• From the Controller interface
• Events get posted to the listener; i.e., the public void controllerUpdate(ControllerEvent event) method gets called for that listener
public void realize()
• Constructs media-dependent portions of the Controller
public void prefetch()
public void start()
public int getState()
• States: Unrealized, Configuring, Configured, Realizing, Realized, Prefetching, Prefetched, and Started
Controller Life Cycle
• Controllers have five resource-allocation states: Unrealized, Realizing, Realized, Prefetching, and Prefetched.
• These states provide programmatic control over potentially time-consuming operations. For example, when a Controller is first constructed, it's in the Unrealized state. While Realizing, the Controller performs the communication necessary to locate all of the resources it needs to function (such as communicating with a server, other controllers, or a file system). The realize method allows an application to initiate this potentially time-consuming process (Realizing) at an appropriate time.
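The forward progression through these five states can be pictured with a small stand-alone sketch. The enum and helper below are illustrative only - they are not part of JMF, where a Controller reports its state via getState:

```java
public class ControllerStates {
    // The five resource-allocation states, in the order a Controller
    // advances through them.
    enum State { UNREALIZED, REALIZING, REALIZED, PREFETCHING, PREFETCHED }

    // Advance one step; Prefetched is the final allocation state.
    static State next(State s) {
        return s == State.PREFETCHED ? s : State.values()[s.ordinal() + 1];
    }

    public static void main(String[] args) {
        State s = State.UNREALIZED; // state right after construction
        s = next(s);                // realize() moves the Controller here
        System.out.println(s);      // prints REALIZING
    }
}
```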
Class javax.media.protocol.PushBufferDataSource
• A data source that manages data in the form of push streams. The streams from this DataSource are PushBufferStreams and contain Buffer objects
• When we use getDataOutput to get a DataSource from our Processor, this DataSource is actually a PushBufferDataSource
• Methods
public abstract PushBufferStream[] getStreams()
• Returns the streams that this source manages. The collection of streams is entirely content dependent. The ContentDescriptor of this DataSource provides the only indication of what streams can be available on this connection.
• For our USB camera capture device, there is only one PushBufferStream
Interface javax.media.protocol.PushBufferStream
• A read interface that pushes data in the form of Buffer objects. Allows a source stream to transfer data in the form of a media object. The media object transferred is the Buffer object as defined in javax.media.Buffer.
• The user of the stream will allocate an empty Buffer object and pass this to the source stream in the read() method. The source stream allocates the Buffer object's data and header, sets them on the Buffer, and sends them to the user.
Interface javax.media.protocol.PushBufferStream
• Methods include
public Format getFormat()
public void read(Buffer buffer)
• Read from the stream without blocking.
• (Not documented regarding what happens if you read and there is no data available - is the buffer empty, or is an IOException raised?)
public void setTransferHandler(BufferTransferHandler transferHandler)
• Register an object to service data transfers to this stream.
Interface javax.media.protocol.BufferTransferHandler
• Implements a callback from a PushBufferStream.
• A PushBufferStream needs to notify the data handler when data is available to be pushed.
• Methods include
public void transferData(PushBufferStream stream)
• Notification from the PushBufferStream to the handler that data is available to be read from the stream. The data can be read by this handler in the same thread or can be read later.
Class javax.media.Buffer
• A media-data container that carries media data from one processing stage to the next inside a Player or Processor. Buffer objects are also used to carry data between a buffer stream and its handler.
• Maintains information including the time stamp, length, and Format of the data it carries, as well as any header information that might be required to process the media data.
Class javax.media.util.BufferToImage
• A utility class to convert a video Buffer object to an AWT Image object - you can then render the Image to the user display with AWT methods.
• Methods include
public BufferToImage(VideoFormat format)
public java.awt.Image createImage(Buffer buffer)
• Converts the input Buffer to a standard AWT image and returns the image.
(From Above) Class java.awt.image.BufferedImage
– A subclass of java.awt.Image
• An Image comprised of a data buffer for the image, plus color model information
• Represents color or grayscale images in various formats
– E.g., TYPE_INT_RGB, TYPE_USHORT_565_RGB, TYPE_BYTE_GRAY, TYPE_USHORT_GRAY
• Methods include
public ColorModel getColorModel()
• ColorModel (java.awt.image.ColorModel) describes the format of the image
public BufferedImage getSubimage(int x, int y, int w, int h)
public int getRGB(parameter list…)
0. int CurrentFrame = InputImage.Capturing the Webcam Images PushBufferStream PushBuffer. BufferToImage ImageConverter = new BufferToImage((VideoFormat) (PushBuffer.read(ImageBuffer).getFormat())). // assume a new frame is ready PushBuffer.createImage(ImageBuffer))).java (Lab1) 30 . From ImageProcessingThread. // created above Buffer ImageBuffer = new Buffer(). BufferedImage InputImage = ((BufferedImage) (ImageConverter.getRGB(0. VideoWidth. VideoHeight. VideoWidth). null. 0.