This 360AnDev session will focus on the more intricate details of implementing Google Cast. To get everyone on the same page, it will start with a brief overview of the platform and the components involved. From there we will go into best practices for common features, pointing out pitfalls along the way.
Topics covered will include queuing, auto play, audio devices, authentication, analytics, and supporting ads. Each topic will include specific use cases and how best to implement them. In addition, any updates announced at Google I/O will be included in the session. Code examples will focus on receivers and Android senders, but could be applied to iOS and Chrome senders as well.
Introduction (0:00)
My name is Caleb Smith, and I’m going to talk about Google Cast. First I want to start with a review of the platform.
In 2013, Chromecast was announced. Since then, it was kind of quiet for a while, and then in 2015 we saw Google Cast announced. That was really where we saw the framework separated from the device that it runs on. Then later that year we saw the Game Manager and Remote Display APIs released, which made things a lot easier for people using Unity; there's an NDK plugin too. In September of that year, we got the second-gen Chromecasts, which were beefed up: we got 5 GHz WiFi, Bluetooth LE, and we also got an audio-only device. Later that year, the audio device got support for multi-room, which is pretty cool. And then more recently, we've had a bunch of announcements for speaker and TV partnerships.
Nowadays, a lot of the devices you’re buying in your normal entertainment outlet support Google Cast.
To start off, I want to go over a few terms:
- The first is senders. When I say sender, we’re talking about the mobile devices or the browsers that the user is using to launch that Cast session.
- Then we have receivers. These are the actual apps that are running on our Cast devices: the Chromecasts, as well as Cast-enabled speakers and TVs.
Now the framework can be split into three parts:
- The first part is the discovery and connection. How does my phone find this device in my house that can support this application that I want to use, and how do I connect to that?
- The next part is the transfer of messages back and forth, whether it's a custom message from a developer or something more.
- The framework has these APIs that are built on top of it for us developers to use for making some really cool apps: the Media framework, Games, and Remote Display. These APIs are all built on top of that message transfer protocol.
User Experience (2:48)
Starting with the UX, Google did a really good job from the start of trying to have a consistent user experience. For all the apps that are out there, they all function the same way. The icon shows up in similar spots, and you can expect the same things as you’re interacting with the platform. They also published UX guidelines, which are handy for keeping in line with those expectations and then also making sure you have all the components that a user’s expecting to see.
For a media app, what are these pieces? We have some examples on slides 7 and 8. The first thing we have is an introductory cling. That was probably more important with some of the initial apps, but think of yourself as a user: you open your app and all of a sudden you have this new icon in the upper right corner. What does that do? The introductory cling can explain what you're supposed to do with it.
Another component is the mini controller at the bottom of the screen. That should be present anytime you're casting content and have navigated away from it, so you're somewhere else in the app. Clicking on the mini controller will take you to an expanded view, which gives you more options, for example forward and backward buttons, and a couple of other things.
Then you’ve also got dialog controls, which you would open from clicking on the Cast icon.
Now you might think that’s all we have to include, but nope, we also have a Cast selection dialog, lock screen controls with a lock screen background image, and we also have this notification control that has to show up when we minimize the app. That’s a lot of stuff to add, so where should you start?
Where to start? (4:45)
We could start from scratch and implement all that stuff ourselves, but that’s a lot of work. Let’s try to share code. The first attempt at this was the Cast Companion Library and then more recently at I/O, the new Cast SDK v3 was announced.
Cast Companion Library has a large developer and app usage base. There are big benefits there because people have run into bugs or fixed issues, and it’s made its way back into the library. And it’s now officially a supported library.
When it first came out, it was just a sample. It takes care of a lot of the common boilerplate code for how to scan for devices, how to manage a session, etc. It also added some customization through asset overriding and some styling. It was rather limited, but over the past year, it's gotten a lot better. One downside is it's Android only, so while we were living the nice life of having all this out of the box, iOS developers had to do it all from scratch with every project. Google has said they're going to continue CCL, even though Cast SDK v3 was announced.
What do we get in v3 of the SDK? Pretty much everything we had in CCL, except for the baked-in expanded widget. It's Android and iOS, so now iOS developers get everything we had, and the APIs are the same between the platforms. It's part of Play Services, which is really nice: you get that level of release support, bug support, and the normal update schedule. And it's flexible. Every component in there that they use to build their own widgets is also meant for external developers to use and build off of. It's very customizable; all the widgets in there have themes.
Now the first thing to do to use this SDK is to configure it. To do that, we have this CastOptionsProvider:
public class CastOptionsProvider implements OptionsProvider {
    @Override
    public CastOptions getCastOptions(Context context) {
        return new CastOptions.Builder()
                .setReceiverApplicationId("12345678")
                .build();
    }

    @Override
    public List<SessionProvider> getAdditionalSessionProviders(Context appContext) {
        return null;
    }
}
The main thing that we need in there is that ApplicationId. You get that after you register your app through the Dev Console. Then we have to add this entry into our manifest to let the framework know where this options provider is, so when it needs to configure it, it knows where to find it:
<!-- AndroidManifest.xml -->
<meta-data
    android:name="com.google.android.gms.cast.framework.OPTIONS_PROVIDER_CLASS_NAME"
    android:value="cast.demo.app.CastOptionsProvider"/>
The next thing we could do is add the Cast icon. Here we’ve got our main menu:
<?xml version="1.0" encoding="utf-8"?>
<!-- main.xml -->
<menu xmlns:android="http://schemas.android.com/apk/res/android"
      xmlns:app="http://schemas.android.com/apk/res-auto">
    <item
        android:id="@+id/cast_route_item"
        android:title="@string/cast_item_title"
        app:actionProviderClass="android.support.v7.app.MediaRouteActionProvider"
        app:showAsAction="always" />
</menu>
// MainActivity.java
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    super.onCreateOptionsMenu(menu);
    getMenuInflater().inflate(R.menu.main, menu);
    CastButtonFactory.setUpMediaRouteButton(this, menu, R.id.cast_route_item);
    return true;
}
The important part is that it's got this MediaRouteActionProvider. This is the same part of the support library you would use if you were trying to connect to a Bluetooth device: you would also implement this and give it an actionProvider.
We've got this CastButtonFactory that we can use, and we can setUpMediaRouteButton, give it the menu and the ID of the item, and that's it. You can launch your application, and you'll have the Cast icon up there. You can connect, you can have the same reconnection logic and all that. It's done.
Ready to Connect (8:28)
MenuItem menuItem = menu.findItem(R.id.cast_route_item);
MediaRouteActionProvider actionProvider = (MediaRouteActionProvider)
        MenuItemCompat.getActionProvider(menuItem);
CastContext castContext = CastContext.getSharedInstance(MainActivity.this);
// zzaij() is the obfuscated accessor for the framework's merged MediaRouteSelector in this SDK release
MediaRouteSelector routeSelector = castContext.zzaij();
actionProvider.setRouteSelector(routeSelector);
What did we just do? The main part is we gave it the MenuItem that it needed, grabbed the actionProvider from it, used this thing called a CastContext, which we'll get into, and then got the routeSelector from the CastContext and set it on the actionProvider.
What is CastContext? It's a singleton object and it's lazily loaded, so anytime we need it and request it with a context, it'll go look for the options provider that we defined in our manifest, get our app ID from it, set everything up, and then we'll be able to go from there. It's the central point for accessing the entire v3 SDK Cast interface. It's also got some pretty handy listeners. One of them is this CastStateListener.
private CastStateListener castStateListener = new CastStateListener() {
    @Override
    public void onCastStateChanged(int castState) {
        switch (castState) {
            case CastState.CONNECTED:
            case CastState.CONNECTING:
            case CastState.NOT_CONNECTED: // devices available
            case CastState.NO_DEVICES_AVAILABLE:
        }
    }
};

private AppVisibilityListener appVisibilityListener = new AppVisibilityListener() {
    @Override
    public void onAppEnteredForeground() { }

    @Override
    public void onAppEnteredBackground() { }
};
Once you’ve registered that and set it up and initialized it, you’ll get these callbacks to know if you connected to a device. If you’re not connected, that means there are devices around you, but you just haven’t done anything with them. Then you can also know if there are no devices anywhere near you.
Then we also have this appVisibilityListener. This is mainly so it knows when to show that notification.
castContext = CastContext.getSharedInstance(this); // any context, anywhere
castContext.addCastStateListener(castStateListener);
castContext.addAppVisibilityListener(appVisibilityListener);
But it can also be handy for anything else you need to do when your app enters the foreground or background. You get the castContext from anywhere you need it, and you add the state listener or the visibility listener.
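One handy use of that CastStateListener, as a rough sketch, is popping the introductory cling from the UX guidelines the first time devices become available. The framework ships an IntroductoryOverlay widget for exactly this; the mediaRouteMenuItem field and the title copy below are just placeholder names for this example.

// Illustrative sketch: call this from onCastStateChanged() whenever
// castState != CastState.NO_DEVICES_AVAILABLE.
private void showIntroductoryOverlay() {
    if (mediaRouteMenuItem == null || !mediaRouteMenuItem.isVisible()) {
        return;
    }
    new IntroductoryOverlay.Builder(MainActivity.this, mediaRouteMenuItem)
            .setTitleText("Touch to cast media to your TV") // placeholder copy
            .setSingleTime() // the framework only shows it once per user
            .setOnOverlayDismissedListener(new IntroductoryOverlay.OnOverlayDismissedListener() {
                @Override
                public void onOverlayDismissed() {
                    // continue with any other first-run UI
                }
            })
            .build()
            .show();
}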
SessionManager sessionManager = castContext.getSessionManager();
// SessionManagerListener<CastSession> - Session Lifecycle Callbacks, Errors
sessionManager.addSessionManagerListener(
sessionManagerListener, CastSession.class);
castSession = sessionManager.getCurrentCastSession();
The castContext also has the sessionManager. That has all the stuff that you're going to need for that actual session, such as knowing if you connected successfully, whether the connection was lost, whether there were any errors, etc.
It also has the castSession. That's the piece that we're really after. castSession has all the application info that we need, such as its status, the app name, and anything like that. It also has all the device info: the device's name, what it supports, etc. It also gives us access to the RemoteMediaClient and to the base messaging framework.
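As a quick sketch of what you can read off the session once it's connected (the names here mirror the framework's getters):

CastSession session = sessionManager.getCurrentCastSession();
if (session != null && session.isConnected()) {
    CastDevice device = session.getCastDevice();
    String deviceName = device.getFriendlyName();          // e.g. "Living Room TV"

    String appName = session.getApplicationMetadata().getName();
    String appStatus = session.getApplicationStatus();     // free-form receiver status

    double volume = session.getVolume();                   // receiver volume, 0.0 to 1.0
    boolean muted = session.isMute();
}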
What we're mainly interested in here is the RemoteMediaClient. To do anything with that, though, we're going to need to do some configuration back in our CastOptionsProvider. The first thing we're going to do is build out these notification options; we're going to set the actions that we want to support:
// CastOptionsProvider.java
NotificationOptions notificationOptions = new NotificationOptions.Builder()
        .setActions(Arrays.asList( // up to 5
                MediaIntentReceiver.ACTION_TOGGLE_PLAYBACK,
                MediaIntentReceiver.ACTION_SKIP_NEXT,
                MediaIntentReceiver.ACTION_DISCONNECT
        ), new int[] { 0, 2 }) // show in compat (condensed) view
        .setSkipStepMs(NotificationOptions.SKIP_STEP_TEN_SECONDS_IN_MS)
        .setTargetActivityClassName(ExpandedControlsActivity.class.getName())
        .build();
You can support up to five actions. They have some baked-in ones, like toggle playback, skip, and disconnect. You could add your own custom ones, but then you'll have to handle those a little bit differently later.
You also see that we pass in this array of integers, new int[] { 0, 2 }. That's which actions you want to show when the notification is in its compact form, which is the condensed view. You can also set the skip time, so anytime previous, rewind, or skip is selected, that's how far it will go. Then you can configure the target activity that you want to be launched if the notification is clicked on.
The next piece that we need to add is CastMediaOptions:
// CastOptionsProvider.java
CastMediaOptions mediaOptions = new CastMediaOptions.Builder()
        .setNotificationOptions(notificationOptions)
        .setExpandedControllerActivityClassName(
                ExpandedControlsActivity.class.getName())
        // careful!
        //.setMediaIntentReceiverClassName(CustomReceiver.class.getName())
        .setImagePicker(new ImagePickerImpl())
        .build();

castOptionsBuilder.setCastMediaOptions(mediaOptions)
        .build();
The first thing we'll do is set the notification options that we just created. Then we also need to add our expanded controller activity again. This is so anything outside of the notification, for example the mini controller, can launch your activity when it needs to.
There's also this MediaIntentReceiver. If you add a custom action to the notification, you can override that, receive the intent, and do some processing with it. The thing you want to watch out for, though, is that not all routes might go through your custom receiver. If you need to do some special action before playing media and you register this, just be aware that not everywhere in the app that plays media is going to go through this path, and that might be something you have to deal with.
The last thing we could do is set the ImagePicker:
public class ImagePickerImpl extends ImagePicker {
    @Override
    public WebImage onPickImage(MediaMetadata mediaMetadata, int type) {
        int imageIndex;
        if (type == IMAGE_TYPE_MEDIA_ROUTE_CONTROLLER_DIALOG_BACKGROUND) {
            imageIndex = 0;
        } else {
            imageIndex = 1;
        }
        if (mediaMetadata.hasImages() && mediaMetadata.getImages().size() > imageIndex) {
            return mediaMetadata.getImages().get(imageIndex);
        }
        return null;
    }
}
ImagePicker is pretty cool, because in the past, CCL and some other libraries had trouble figuring out what image they were supposed to show at what time. They added this ImagePicker that gets called anytime the framework needs an image for a location or widget that they provide, and it'll pass you the media metadata and a type.
Now we've configured our media. What did that give us? We now have the hardware button controls for volume. We have image selection, so we can provide the right images for our backgrounds. We have a notification that should work anytime something's playing, and we have a lock screen background. All of that just by configuring the CastOptionsProvider.
Create a Media Object (13:54)
Let’s create a media object so that we can Cast it. The first thing we’re going to do is create this media metadata object.
MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_GENERIC);
metadata.putString(MediaMetadata.KEY_TITLE, "Title");
WebImage thumbnail = new WebImage(Uri.parse("http://demo/thumbnail.png"));
WebImage background = new WebImage(Uri.parse("http://demo/background.png"));
metadata.addImage(thumbnail);
metadata.addImage(background); // ordering is important!
I've chosen to give it MEDIA_TYPE_GENERIC. There are some other built-in ones, like movies and shows. All that does is define a group of metadata fields that you can set on it and that other senders will be expecting.
We give it a title, create some images, a thumbnail and a background, and we'll go ahead and add those to the metadata. Now remember the ordering is important, because all your platforms that implemented that ImagePicker are going to be expecting those images in that exact spot.
You might have some metadata that's more complicated than one of those canned versions. What you could do here is provide a metadata type of MEDIA_TYPE_USER:
MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_USER + 1);
metadata.putString(MediaMetadata.KEY_TITLE, "Title");
// ...
MediaInfo info = remoteMediaClient.getMediaInfo();
String title = info.getMetadata().getString(MediaMetadata.KEY_TITLE);
TextUtils.isEmpty(title); // empty!
Right now that constant is set at 100. Everything below that is reserved for the framework, so we add one to it. You can't use anything below that, or if you do, it could get stomped on.
The next thing we're going to do is add this title again. The only issue with this, though, is that later we might go and ask the remoteMediaClient for the media that's playing. We'll go through the metadata, try to get the title, check whether we have a title, and it's going to be empty.
The reason for that is the defined keys are reserved for the defined media types. If you try to use any of the media metadata keys with a user media type or anything higher than that, they're not going to show up. You will see them on the receiver in the debug menu, but in your application, when you ask for them, you won't get them.
The next part is we need a content ID:
String contentId = “http://demo/test.mp4";
MediaInfo mediaInfo = new MediaInfo.Builder(contentId)
.setContentType("video/mp4")
.setStreamType(MediaInfo.STREAM_TYPE_LIVE)
.setMetadata(metadata)
.build();
Now normally this is a URL. It can also be any identifier that the receiver can use to look up this media and figure out what it's supposed to play. Then we're going to use MediaInfo.Builder, give it the ID, set a content type to be a good citizen, tell it the stream type, set our metadata, and build it. This mediaInfo object is what I like to call a core object.
What do I mean by core object? It is broadcast to all senders. Even though you're at home on your own WiFi, those aren't the most consistent network conditions, and you can have senders coming and going, especially on the iOS side. As soon as they get backgrounded and come back, they need a status update, and when they connect, they're going to get any playing media broadcast to them. That media info object needs to have all the necessary information that they need to display the UI.
Normally you'd have your title that it needs to display in the metadata, but you might have other stuff. For example, if you're building a sports app, you might have some logos that you need to load based off of what teams are playing. All that stuff needs to be in there. For the remoteMediaClient, that's the media info object, but for the Game Manager that would be the game state, which is just a big JSON object that's broadcast to everyone.
How do you add this to it? All of your media info objects have this field for custom data:
String contentId = “http://demo/test.mp4";
MediaInfo.Builder mediaInfoBuilder = new MediaInfo.Builder(contentId)
.setContentType("video/mp4")
.setStreamType(MediaInfo.STREAM_TYPE_LIVE)
.setMetadata(metadata);
JSONObject customJson = new JSONObject("{\"homeTeam\": \"\", \"awayTeam\": \"\"}");
infoBuilder.setCustomData(customJson);
Custom data can be some blob of JSON, whatever you need to fit in there. To load that media, we're going to get a remoteMediaClient from our castSession and load our mediaInfo:
RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
remoteMediaClient.load(mediaInfo);
remoteMediaClient.load(mediaInfo, autoStart, startPosition);
remoteMediaClient.load(mediaInfo, autoStart, startPosition, customJson);
We can even load our mediaInfo, tell it whether we want it to auto-start or not, and give it a start position. Maybe they already watched 30 seconds into the clip, and they can start there. There's also another hook for some custom data here; for a lot of the calls in the framework, you'll see this JSON custom field that'll let you add whatever you need.
For updating the UI, something you might try to do is get the remoteMediaClient, call pause, and then immediately show your pause UI:
RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
remoteMediaClient.pause();
showPause(); // don't do this!
Don’t do this. Remember, we were talking about the network maybe not being the most reliable. We don’t know if the receiver received our pause command. If you go ahead and do this, then the state you’re showing on the phone isn’t going to match what the TV’s doing.
Instead, what we want to add is this RemoteMediaClientListener:
private class RemoteMediaClientListener implements RemoteMediaClient.Listener {
    @Override
    public void onSendingRemoteMediaRequest() {
        showSpinner();
    }

    @Override
    public void onStatusUpdated() {
        MediaStatus mediaStatus = remoteMediaClient.getMediaStatus();
        switch (mediaStatus.getPlayerState()) {
            case MediaStatus.PLAYER_STATE_PAUSED:
                showPause();
                break;
            // ...
        }
    }
    // ...
}

RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
remoteMediaClient.addListener(new RemoteMediaClientListener());
remoteMediaClient.pause();
That is going to have a callback for when our status is updated, and it’s also going to have a callback for when we send out a media request. That’ll let us know if we need to show a spinner. We can go ahead and set that up. When our status is updated, we can see what status it is, and then update our UI. We’d set our listener and then call pause.
This is the hard way of updating your UI. With the new SDK, we now have the UIMediaController:
// centralizes SessionManagerListener, RemoteMediaClientListener
UIMediaController uiMediaController = new UIMediaController(MainActivity.this);

// allows binding views to state
uiMediaController.bindImageViewToImageOfCurrentItem(
        playingThumbnail,
        ImagePicker.IMAGE_TYPE_MINI_CONTROLLER_THUMBNAIL,
        R.drawable.cast_mini_controller_img_placeholder);
uiMediaController.bindViewVisibilityToPreloadingEvent(upNextLayout, View.GONE);

// called after all other RemoteMediaClientListener callbacks
uiMediaController.setPostRemoteMediaClientListener(new RemoteMediaClientListener());
With the UIMediaController, you create it anywhere, passing it a context. It wraps the SessionManagerListener and our RemoteMediaClientListener. Now all the callbacks that you would get from those, you can get from this single object, which is especially handy for the expanded controls, because that's all the stuff it also needs. It introduces this cool paradigm for binding image views, or any views, to different states of the Cast session.
In this case, we're binding a view to whatever item is currently playing, so this could be our thumbnail. We could do it for pretty much anything. We're also providing what type that is for the ImagePicker. They also let you bind the visibility of a view: if the receiver is pre-loading the next item that's going to play, you could show your upNextLayout. The post listener is called after all other listeners, which is nice in case you need to do something else; maybe you don't necessarily want the UI to update until the final state is there.
You want to avoid custom messages for any media-related items; the media channel is really designed to handle these. For example, on a slideshow you might think, "Oh, I need to add some next-picture command," and send that out. Well, you could still use the media channel and use a skip or next command, like before, and that helps keep all the senders in sync.
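For that slideshow case, a plain media-channel call like this sketch keeps every connected sender in sync without a custom namespace:

// "Next picture" without a custom message: just advance the shared queue.
RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
remoteMediaClient.queueNext(null); // null = no custom data
// queuePrev(null) works the same way for going back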
But if you do need a custom media channel, you can go ahead and register one:
Cast.MessageReceivedCallback callback = new Cast.MessageReceivedCallback() {
    @Override
    public void onMessageReceived(CastDevice castDevice, String namespace, String message) {
        // handle message
    }
};

String namespace = "urn:x-cast:com.custom.cast";
try {
    castSession.setMessageReceivedCallbacks(namespace, callback);
} catch (IOException e) {
    // error!
}

// urn:x-cast:com.google.cast.media
First, you create the callback. You have to define your namespace; it needs to begin with that urn:x-cast part, and then whatever domain you want to give it so that you can get those messages. Set the callback and you are good to go.
This is the exact same thing the remoteMediaClient and the Game Manager are doing. You can see there at the bottom that the remoteMediaClient is just using that namespace for media stuff, and it's sending all the play, pause, skip, and everything else over that message namespace.
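Sending in the other direction is just as short; a sketch using the namespace registered above (the JSON payload is made up):

// Send a custom message to the receiver and check whether it was delivered.
castSession.sendMessage(namespace, "{\"command\": \"customAction\"}")
        .setResultCallback(new ResultCallback<Status>() {
            @Override
            public void onResult(Status status) {
                if (!status.isSuccess()) {
                    // message was not delivered; retry or surface an error
                }
            }
        });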
Ads + queue (21:33)
How would you put ads in a queue alongside your content? We do a similar thing, where we go through and create our ad media info:
// MEDIA_TYPE_AD is assumed to be a custom media type (e.g. MediaMetadata.MEDIA_TYPE_USER + n)
MediaMetadata adMetadata = new MediaMetadata(MEDIA_TYPE_AD);
MediaInfo adInfo = new MediaInfo.Builder(adId)
        .setContentType("video/mp4")
        .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
        .setMetadata(adMetadata)
        .build();

MediaQueueItem adItem = new MediaQueueItem.Builder(adInfo).build();
MediaQueueItem mediaItem = new MediaQueueItem.Builder(mediaInfo).build();
MediaQueueItem[] items = new MediaQueueItem[] { adItem, mediaItem };
remoteMediaClient.queueLoad(items, 0, MediaStatus.REPEAT_MODE_REPEAT_OFF, customJson);
We can give it our metadata so we can identify it later. Then we wrap those in these MediaQueueItems, so we'll have an adItem and a mediaItem, and then we'll build our queue, which is an array of those items. Then call queueLoad with the items, which one to start with, and the repeat mode that we want. You can set up a queue so that it endlessly repeats, or have repeating off, and there are some other modes.
How do we tell that an ad is playing? We'd have our client listener in our setup, and then when our metadata is updated, we would get the media info for the current item and check the MediaType:
private class RemoteMediaClientListener implements RemoteMediaClient.Listener {
    @Override
    public void onMetadataUpdated() {
        MediaInfo info = remoteMediaClient.getMediaInfo();
        if (info.getMetadata().getMediaType() == MEDIA_TYPE_AD) {
            hideSeekBar();
            disablePause();
        }
        // ...
    }
    // ...
}
If it's the ad type that we set up, we can go ahead and hide the seek bar or disable pause or whatever else we need to do. The receiver can also build this queue for you. You can have that logic totally remote; you could send up the five videos that you want to play and have the receiver inject the ads into the queue in between. The receiver, again just like with regular media, could go and retrieve those ad URLs, and it can listen for ID3 tags in streams and fire ad listener events.
Another thing the receiver can do for you is adjust the playback rate. When you ask for the current position of the media that's playing, if an ad's playing, maybe you don't want that time to advance. The receiver can set the playback rate, and that's used as a multiplier. It can say, "Okay, 10 seconds have passed," but the playback rate can be set to zero, so it'll cancel out. It'll look like your current time is still a minute into the video.
Tips
Authentication tips: keep the senders authorizing the content. You've already built these intricate login flows and handled a bunch of use cases. Keep the gatekeeper on the senders rather than making the receiver do any kind of authentication it doesn't have to. I know this can be hard, because sometimes tokens only last for about 30 seconds, but you should really ask yourself: what is your average session length going to be? Try to make the token last as long as that.
The receiver can send the auth token along as cookies, you can add a custom header for it, or you can even add a query parameter that's got the token or some credentials in there. For getting the auth token over to the receiver, the load command's custom data is a really handy thing to use.
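As a sketch of that last point (the field names are made up), the token the sender already holds can ride along in the load call's custom data, and the receiver can then attach it to its content requests:

try {
    JSONObject customJson = new JSONObject();
    customJson.put("authToken", userAuthToken); // hypothetical field names
    customJson.put("accountId", accountId);
    remoteMediaClient.load(mediaInfo, true, 0, customJson);
} catch (JSONException e) {
    // malformed credentials
}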
Analytics: the first thing I'd say is check out the Dev Console. It was recently updated, and it's got a bunch of analytics in there. It's got device and session counts and average playback time. You can filter it by the past 28 days, past 7 days, etc. It also breaks down what country and what device type the user had.
Now if that doesn't work for you or your client, then some things to consider are: what kind of analytics are you trying to collect, and where do they fit best? The sender is really good for stuff like whether it ever encountered a receiver and different interactions with the UI. It's not good for things like session length because, like on iOS, if you background the app, your session just ended. You do have a token that you could use to restore it, but it's a lot harder to keep track of.
The receiver is running for the entire session. It’s really good for determining how many devices connected to me during that session, how many videos were played in that session, and anything like that. It’s definitely the best thing for media events. Play, pause, did the video actually end? All that stuff that you want to keep there.
A lot of times, you'll ask the analytics company if they have a Chromecast SDK. The first response you're going to get is no, they don't. Then you ask them if they have an HTML5 or JavaScript SDK, and they'll respond, yes, of course we do. Well, those SDKs can probably run on the receiver, so it's always worth giving it a shot. Another one is Conviva, which hooks into media element events. The receiver is using the media element and pumping video data to it, so anything that does that you can also use.
Another thing you should do is validate that the content you're casting is actually suited for it. Supported format is something you want to check. The content origin: where is this stuff hosted, and can the receiver play it? The biggest headache you'll run into a lot of times is trying to play video and getting some error that the website can't load some JavaScript or some content from another domain. This stuff is left over from a long time ago, where it's a very good security measure to keep websites from loading malicious content from another domain or anything like that.
Also check the resolution you're using, and whether the orientation is normalized. What I mean by that is if you're doing a slideshow and you want to show some photos up on the receiver, devices often take those at a different rotation. If you don't normalize that beforehand, the receiver's got to know that and be able to flip it and adjust the rotation by looking at the EXIF data, which is very hard to do on the receiver. It's also hard to do on the senders. If you can do that beforehand with some pre-processing and then cast it, that's always good.
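A minimal sender-side sketch of that pre-processing, using the platform's ExifInterface to read the stored rotation and bake it into the bitmap before casting it:

// Normalize a photo's orientation on the sender before casting it.
private Bitmap normalizeOrientation(String photoPath) throws IOException {
    Bitmap bitmap = BitmapFactory.decodeFile(photoPath);
    int orientation = new ExifInterface(photoPath).getAttributeInt(
            ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);

    int degrees = 0;
    switch (orientation) {
        case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
        case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
        case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
    }
    if (degrees == 0) {
        return bitmap; // already upright
    }

    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);
    return Bitmap.createBitmap(bitmap, 0, 0,
            bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}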
Another thing is to use the demo apps. You can get the Android or iOS demo app, swap out the URL for some of your content in there, and cast it to their sample media player. Their sample receiver can play any of the content types that they support, so if it plays your stream, then you're good to go.
Guest mode (28:21)
Let's talk a little bit about guest mode. Earlier I talked about the discovery and connection part. Usually that's over WiFi, but now, with the second-generation devices, we also have Bluetooth LE. On Android we also have audio tokens, which can be broadcast as a really high-pitched sound between the TV and your device and used to connect without being on the WiFi. There's also a fallback PIN authentication that you can use.
When you're doing this, you'll notice that when you click the Cast icon, the route will show up as a nearby device in there. One thing to keep in mind is that this is transparent to your app; you don't have to do anything as a developer to get this support. The other thing to keep in mind, though, is that it does go through a cloud relay, so if you're hosting any local content, you won't be able to support this.
Audio devices use the same API as the remote media player. If there's no reason not to support them, you can go into the Dev Console and opt in. On the senders, when you connect to a device, you can check whether that device has certain capabilities:
// sender
CastDevice castDevice = castSession.getCastDevice();
castDevice.hasCapability(CastDevice.CAPABILITY_VIDEO_OUT);
castDevice.hasCapability(CastDevice.CAPABILITY_AUDIO_OUT);
castDevice.hasCapability(CastDevice.CAPABILITY_VIDEO_IN);

// receiver, Chromecast Audio
castReceiverManager.getDeviceCapabilities()
// Object { bluetooth_supported: true, display_supported: false, hi_res_audio_supported: true }

// server request header
CAST-DEVICE-CAPABILITIES: {"display_supported": false}
In here, we'd take the Cast session, get the Cast device, and then ask: does it have support for video out? Does it have support for audio out? One thing I noticed recently that's interesting is that they also have video in and audio in capabilities.
On the receiver side, you can also get the device capabilities for the device you're running on. Again, you'll see the same kind of thing: is Bluetooth supported, is a display supported, is hi-res audio supported. Another thing you can do is, when the request is made to load your receiver app, there will be a header in that request. You can check that and actually load two different receivers, depending on whether you're going to run on an audio device or on one with a display.
Debugging / Conclusion (30:32)
A few debugging tips. Even if you’re not a web developer or anything, the Chrome debugger is your friend. It’s going to be telling you exactly what’s being broadcast to all your senders, and then you can check if you’re seeing that in your app.
They've made it really easy to get to the debugger now. You go to chrome://inspect if you're on that network, and any of the Cast devices on that network will show up. While you're inspecting, you can type in the logger-level commands and set everything to debug, so you can see what's going on. In the middle of a session, you can reload the whole page, and you can pass true so that it clears the cache, in case there was a receiver update, to make sure you're working with the newest receiver.
Often, when I’ve seen the Chromecast reboot, for example, in the middle of testing something, it’s usually a memory issue. One thing with that is, if you’re sending images for your slideshow, and you’re sending them at a very, very high resolution, Chromecast is only going to display those at 720p. You don’t need to send a giant picture to it.
Another thing you could do is enable crash logs in the device settings for the Cast app. Then if you need access to those crash logs, you can reach out to Google and ask them for that.
Some common problems are with whitelisting. If you're unable to debug a device when you go to chrome://inspect and no apps show up, you're probably not whitelisted for it. What I mean by whitelisted is that in the Dev Console, you have to register the serial numbers of the devices that you want to be able to debug and work with your app. Also, if you're hitting that Cast icon and there are no devices showing in there, then chances are you're not whitelisted.
Q & A (40:58)
Q: If you had multiple senders connected to the same receiver, how would you manage a queue of these items?
Caleb: There are commands for inserting and removing, and you can give it specific indexes. Every time you do that, the queue is then re-broadcast out to all the senders. They should be able to get an updated version of it and add or remove items themselves. You still might run into some issues where they do it at the exact same time, in which case it’s going to be a whoever got there first kind of thing.
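For reference, those queue-editing calls on the sender look roughly like this; the item IDs come from the MediaStatus that's broadcast to every sender:

// Append a new item to the end of the shared queue (null = no custom data)
MediaQueueItem newItem = new MediaQueueItem.Builder(mediaInfo).build();
remoteMediaClient.queueAppendItem(newItem, null);

// Insert items before an existing item, identified by its queue item ID
remoteMediaClient.queueInsertItems(new MediaQueueItem[] { newItem }, existingItemId, null);

// Remove an item; the updated queue is then re-broadcast to all senders
remoteMediaClient.queueRemoveItem(existingItemId, null);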
Q: What receiver device do you develop against?
Caleb: At home, I still use the first-gen device. It's the one that's closest to me and actually plugged into a monitor. But the Nexus Players, since they're receiver devices, are very friendly to work with. Since the debugging is over USB instead of through your browser, it's much quicker and seems to be much more responsive. I would definitely recommend something that's plugged in over ADB so that you can debug.