As you might have figured out by now, the common feature of these applications is displaying a video with customised media player controls. I thought it would be a good time to share some knowledge and ideas around video playback on Android. A couple of months ago I gave a talk about it at Droidcon Berlin showing some code, but I never had enough time to turn it into a library.
Media Player lifecycle
The first thing we need to understand before playing video or audio is the lifecycle of the MediaPlayer.
As you can see it’s pretty straightforward: there are only 10 different states…
The most important piece of information that can be extracted from this diagram is that one needs to be very strict regarding the behaviour of the MediaPlayer. It’s very easy to follow the steps in the wrong order, which will lead to frustration, since the debug information that the MediaPlayer provides is pretty bad.
This information can be accessed using two different listeners on the MediaPlayer: the OnInfoListener and the OnErrorListener. We are learning all this before actually displaying any content, and you thought this was going to be easy!
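Because those listeners deliver bare integer codes, the logs stay cryptic unless you translate them. Here is a plain-Java sketch of a decoder; the class and method names are my own, and the constant values are copied from the android.media.MediaPlayer documentation so it runs off-device:

```java
// Decodes the `what`/`extra` ints delivered to MediaPlayer.OnErrorListener.
// Constant values mirror the documented android.media.MediaPlayer constants.
public class MediaErrorDecoder {

    // `what` codes
    static final int MEDIA_ERROR_UNKNOWN = 1;
    static final int MEDIA_ERROR_SERVER_DIED = 100;

    // `extra` codes
    static final int MEDIA_ERROR_IO = -1004;
    static final int MEDIA_ERROR_MALFORMED = -1007;
    static final int MEDIA_ERROR_UNSUPPORTED = -1010;
    static final int MEDIA_ERROR_TIMED_OUT = -110;

    public static String decode(int what, int extra) {
        String cause = (what == MEDIA_ERROR_SERVER_DIED)
                ? "media server died" : "unknown error";
        String detail;
        switch (extra) {
            case MEDIA_ERROR_IO:          detail = "file or network I/O"; break;
            case MEDIA_ERROR_MALFORMED:   detail = "malformed stream"; break;
            case MEDIA_ERROR_UNSUPPORTED: detail = "unsupported format"; break;
            case MEDIA_ERROR_TIMED_OUT:   detail = "operation timed out"; break;
            default:                      detail = "extra=" + extra; break;
        }
        return cause + " (" + detail + ")";
    }
}
```

Logging `decode(what, extra)` from your OnErrorListener is a lot friendlier than staring at `(1, -1010)`.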
Now that we understand how the MediaPlayer works, the next step is to decide how we are going to display the video: which View are we going to use?
TextureView vs SurfaceView
The easiest way of displaying a video on Android is using a VideoView. With very few lines of code you’ll be able to display a video from a local source or from the internet, and it will just play. This is a very good approach if you don’t need any fancy stuff: no customisation, no callbacks. In our case customisation was needed, hence callbacks were needed, and this solution was simply not good enough.
My first thought was to extend VideoView and add some functionality, but if you check the source code, VideoView extends SurfaceView, and that comes with some limitations and constraints.
SurfaceView provides a dedicated drawing surface embedded inside of a view hierarchy; you can control the format and size of this surface, and the SurfaceView takes care of placing the surface at the correct location on the screen.
The main limitation is that a SurfaceView can’t be animated, moved or transformed; that’s because it’s created in a separate window and not treated as a regular View. That was a big problem considering what we wanted to achieve, so we turned our eyes to TextureView.
It’s not all bad news for SurfaceView: Chrome used to use TextureView as the compositor target surface, but they went back to using SurfaceView. These are the main reasons:
- Because of its invalidation and buffering behaviour, TextureView adds 1-3 extra frames of latency to display updates.
- TextureView is always composited using GL, whereas a SurfaceView can be backed by a hardware overlay, which uses less memory bandwidth and power.
- The internal buffer queue of TextureView can end up using more memory than a SurfaceView.
- We weren’t really doing anything useful with the animation and transform capabilities of TextureView.
The full discussion in Google Groups can be found here.
We wanted to transform and animate that view so our choice was already made.
So the video is going to be displayed in a TextureView; now we need a way of interacting with the video. Yes, the media player controls.
As you can see, we decided to use a similar approach to what YouTube does: the custom controller occupies the whole screen, making it very easy to use. Quite frankly, this is much more appealing from my point of view. We’ve all seen designs for video players where the controller is aligned to the bottom of the screen (inspired by a certain other platform), and I certainly believe the full-screen approach makes more sense.
The most important part of this custom controller is the root layout, which is going to enable/disable the controller itself. It’s nothing more than a FrameLayout that we put as the layout root in order to get notified about any screen touches. The PlayerController will set a listener and show or hide the view accordingly.
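Framework plumbing aside, the show/hide contract is simple toggle logic. A plain-Java sketch of it (class and method names are illustrative, not Fenster’s actual API), where the root FrameLayout’s touch listener would call onScreenTouched():

```java
// Every touch on the root layout toggles the controller's visibility.
// The commented-out calls mark where the real Android view work would go.
public class ControllerVisibility {

    private boolean showing;

    // Called by the root FrameLayout's touch listener.
    public void onScreenTouched() {
        if (showing) {
            hide();
        } else {
            show();
        }
    }

    public void show() { showing = true;  /* rootView.setVisibility(View.VISIBLE) */ }
    public void hide() { showing = false; /* rootView.setVisibility(View.GONE) */ }

    public boolean isShowing() { return showing; }
}
```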
How does the PlayerController work?
Basically, the big part of the job is done when the PlayerController needs to be attached. It’s the TextureView that takes care of this job, setting an anchor view for the controller. Once this view is anchored, it’s just a matter of setting listeners for all the events we want to track (Play/Pause/Seek) and handling the communication between the view and the MediaPlayer.
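The wiring between controller and player boils down to forwarding user events. A plain-Java sketch under stated assumptions: PlayerBackend is a stand-in interface I’ve invented for whatever owns the MediaPlayer, and the method names are illustrative, not Fenster’s API:

```java
// Forwards Play/Pause/Seek events from the controller UI to the player owner.
public class ControlWiring {

    // Stand-in for the MediaPlayer-owning view.
    interface PlayerBackend {
        void start();
        void pause();
        void seekTo(int positionMs);
    }

    // What the controller's buttons and seek bar invoke on user interaction.
    static class PlayerController {
        private final PlayerBackend backend;

        PlayerController(PlayerBackend backend) {
            this.backend = backend;
        }

        void onPlayClicked()  { backend.start(); }
        void onPauseClicked() { backend.pause(); }
        void onSeek(int ms)   { backend.seekTo(ms); }
    }
}
```

The point of the indirection is that the controller never touches the MediaPlayer directly, which is exactly the separation described below.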
At this point we know how the controller is shown/hidden and which component that view is anchored to, but we are missing the big piece of this puzzle: the View that handles it all, the TextureVideoView. This view is the only one communicating with the MediaPlayer, which allows us to handle the interaction between the controller and the player transparently to the other parts of the application. Going back to the initial state diagram, one can see that it would be very helpful to save the actual state of the player in order to know which steps can follow. The TextureVideoView keeps track of that state internally.
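That bookkeeping can be sketched as a small state machine mirroring the MediaPlayer diagram. This is a plain-Java illustration of the idea, not Fenster’s actual code; the guard in start() reflects the documented rule that start() is only legal from the Prepared, Started, Paused and PlaybackCompleted states:

```java
// Tracks the MediaPlayer's position in its lifecycle so we never call
// a method from an illegal state (which would throw IllegalStateException).
public class PlayerStateMachine {

    enum State {
        IDLE, INITIALIZED, PREPARING, PREPARED,
        STARTED, PAUSED, STOPPED, PLAYBACK_COMPLETED,
        ERROR, END
    }

    private State state = State.IDLE;

    State getState() { return state; }

    boolean canStart() {
        return state == State.PREPARED || state == State.STARTED
                || state == State.PAUSED || state == State.PLAYBACK_COMPLETED;
    }

    void onDataSourceSet() { state = State.INITIALIZED; }
    void onPrepareAsync()  { state = State.PREPARING; }
    void onPrepared()      { state = State.PREPARED; }

    // Returns false instead of blowing up the way a raw MediaPlayer would.
    boolean start() {
        if (!canStart()) {
            return false;
        }
        state = State.STARTED;
        return true;
    }
}
```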
Another important aspect of this view is handling all the listeners that the MediaPlayer provides.
As you can see, there is a big chunk of listeners to be set here; it’s not the nicest-looking part, but it’s the most helpful of all. Not only have we set all the listeners, we have also created the SurfaceTexture where the video will be displayed, set the data source with the video information and, last but not least, asked the MediaPlayer to prepare itself in order to start playing the video.
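To make that sequence concrete, here is a minimal sketch of the setup, written against the public MediaPlayer API; it is illustrative rather than Fenster’s actual code, and the method name is my own:

```java
import android.media.MediaPlayer;
import android.view.Surface;

public class AsyncPrepareSketch {

    // Data source -> listeners -> prepareAsync(); playback only begins
    // once onPrepared confirms we have reached the Prepared state.
    public void openVideo(String url, Surface surface) throws Exception {
        final MediaPlayer player = new MediaPlayer();
        player.setDataSource(url);
        player.setSurface(surface); // Surface wrapping the TextureView's SurfaceTexture

        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // safe: we are now in the Prepared state
            }
        });
        player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                mp.reset();  // back to Idle so the player can be reused
                return true; // true = handled, no further callbacks
            }
        });

        // prepare() would block the calling thread while buffering;
        // prepareAsync() returns immediately and fires onPrepared when ready.
        player.prepareAsync();
    }
}
```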
It’s important here not to call prepare() but to use prepareAsync() instead. As you can imagine, prepare() runs synchronously on the calling thread, so invoking it on the UI thread will block the application; by using prepareAsync() together with an OnPreparedListener we’ll be notified when the MediaPlayer can start playing the video. It’s just another listener, and we love them, don’t we?
Notify when the video is playing
Of all the listeners we’ve set earlier, perhaps the one that tells us the most about the MediaPlayer status is the OnInfoListener.
These three flags give us vital information: they tell us when the first frame of the video starts to play, when the MediaPlayer starts buffering, and when the video is playing again after buffering. Finally, all this information is sent to the PlayerController in order to show/hide the controls and the loading view.
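A plain-Java sketch of that dispatch: the constant values match the documented MEDIA_INFO constants on android.media.MediaPlayer, while the returned action strings are just illustrative stand-ins for PlayerController calls:

```java
// Maps the three MEDIA_INFO flags to the controller action they should trigger.
public class InfoDispatcher {

    static final int MEDIA_INFO_VIDEO_RENDERING_START = 3;
    static final int MEDIA_INFO_BUFFERING_START = 701;
    static final int MEDIA_INFO_BUFFERING_END = 702;

    public static String onInfo(int what) {
        switch (what) {
            case MEDIA_INFO_VIDEO_RENDERING_START:
                return "first frame rendered: hide loading view";
            case MEDIA_INFO_BUFFERING_START:
                return "show loading view";
            case MEDIA_INFO_BUFFERING_END:
                return "hide loading view";
            default:
                return "ignored";
        }
    }
}
```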
To wrap things up, it’s always better to have a project to look into. That’s why I’ve created Fenster, a library where you’ll find all the views we’ve been talking about in this post, plus a nice demo project showing it all.
As always, fork it, use it, and if you want to contribute pull requests are more than welcome!