<SYSTEM>This is the developer documentation for Mux Video.</SYSTEM>

# Migrate from self-managed video infrastructure to Mux
A step-by-step guide for teams migrating from custom, multi-service video infrastructure (AWS MediaConvert, S3, CloudFront, FFmpeg, etc.) to Mux.
This guide is for teams that have built their own video infrastructure by stitching together multiple services — custom transcoding, object storage, general-purpose CDNs, and workflow orchestration — and are considering a move to Mux.

If your current setup involves three or more services and workflows for encoding, storage, delivery, analytics, and playback, this guide will walk you through migrating to Mux.

<Callout type="info">
  If you're building a new video product from scratch rather than migrating, check out our [Mux fundamentals](/docs/core/mux-fundamentals) guide instead.
</Callout>

<Image src="/docs/images/migration-flow.png" width={1536} height={356} alt="Migration timeline showing 5 phases: Assess your current setup, Set up Mux, Test with a small batch, Migrate your full library (with option to run both platforms in parallel), and Cut over to Mux." />

## 1. Assess your current setup

Before migrating, take stock of your current video infrastructure. A typical DIY video workflow involves many different processes across multiple services. Mux replaces this entire stack with a single API:

* **[Direct uploads](/docs/guides/upload-files-directly)** — Ingest video from your users or your storage with no separate upload server or bucket to manage
* **[Automatic transcoding](/docs/guides/use-video-quality-levels)** — Every asset is encoded into multiple renditions with adaptive bitrate streaming out of the box
* **[Thumbnails and previews](/docs/guides/get-images-from-a-video)** — Generate thumbnails, animated GIFs, and storyboards from any point in a video
* **[Auto-generated captions](/docs/guides/add-autogenerated-captions-and-use-transcripts)** — Transcription and captioning built into the ingest pipeline
* **[Webhooks](/docs/core/listen-for-webhooks)** — Real-time notifications for asset status so you can update your CMS or database automatically
* **[Multi-CDN delivery](/docs/guides/play-your-videos)** — Global delivery without managing CDN configurations or origin servers
* **[Mux Player](/docs/guides/mux-player-web)** — Drop-in player for web, iOS, and Android with built-in analytics
* **[Mux Data](/docs/guides/data)** — Video analytics including quality of experience, engagement, and performance metrics

## 2. Get set up with Mux

To start using Mux, follow these steps. Signing up for an account is free.

1. **[Sign up](https://dashboard.mux.com/signup)** for a Mux account.
2. **Review [Mux fundamentals](/docs/core/mux-fundamentals)** to understand core concepts.
3. **Generate [API credentials](https://dashboard.mux.com/settings/access-tokens)** and install the [Mux SDK](/docs/core/sdks) for your stack.
4. **Upload a test video** and verify playback works end-to-end.
5. **[Set up webhooks](/docs/core/listen-for-webhooks)** and **[integrate Mux Player](/docs/guides/mux-player-web)** (or use [any HLS player](/docs/guides/play-your-videos)).

## 3. Move your content

Mux handles libraries of any size, so the same migration approach works whether you have a dozen videos or millions.

### Option 1: Use Truckload

[Truckload](https://migrate.mux.dev) is a migration tool that handles bulk content import. It runs as a hosted service with no infrastructure setup required, or you can self-host and customize it for your specific migration needs since it's open source.

* Works with S3‑compatible storage, Vimeo, Cloudflare Stream, and other sources
* Truckload pulls your content, sends it through the Mux processing pipeline, and prepares it for playback
* View current status in the Truckload dashboard
* You can run the migration in waves or all at once
* Use the hosted instance for simplicity, or fork and modify the open source version for custom workflows

Truckload runs on [Inngest](https://www.inngest.com/), a background job orchestration platform.

### Option 2: Build your own migration flow

If you have specific requirements or you're working with a CMS, you can run the migration yourself using the Mux API.

* To ingest your video files, call the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create Asset API</ApiRefLink> with a URL pointing to each video file. Mux pulls and encodes them asynchronously.
* Whether you're on Contentful, Sanity, WordPress, or a custom CMS, store the returned Mux playback ID and asset ID against your existing content record in your CMS as each asset is processed.
* Listen for `video.asset.ready` webhook events and use them to automatically update the corresponding record in your CMS the moment an asset is ready for playback.
* Because you're updating records one by one as assets are processed, you can start serving content from Mux without waiting for the full migration to complete.
* Process your library in batches to stay within API rate limits, and use your CMS as the tracker for what's been migrated.

Here's an example migration script that lists videos in your S3 bucket and ingests them into Mux:

```node

import Mux from '@mux/mux-node';
import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

const mux = new Mux();
const s3 = new S3Client({ region: 'us-east-1' });

// List videos in your S3 bucket. ListObjectsV2 returns at most 1,000 keys
// per call; use IsTruncated/ContinuationToken to paginate larger buckets.
const { Contents: objects = [] } = await s3.send(
  new ListObjectsV2Command({
    Bucket: 'my-video-bucket',
    Prefix: 'videos/',
  })
);

for (const obj of objects) {
  if (obj.Key.endsWith('/')) continue; // skip folder placeholder keys

  const asset = await mux.video.assets.create({
    inputs: [{ url: `https://my-video-bucket.s3.amazonaws.com/${encodeURI(obj.Key)}` }],
    playback_policies: ['public'],
    video_quality: 'basic',
    passthrough: obj.Key, // Link back to your S3 key
  });

  console.log(`Created asset ${asset.id} for ${obj.Key}`);

  // Store asset.id and asset.playback_ids[0].id in your database
}

```



## Things to keep in mind

<Callout type="warning">
  **Test first.** Always test your migration flow with a small batch of sample content before migrating your full library.
</Callout>

### Choose your video quality tier

Mux offers multiple [video quality tiers](/docs/guides/use-video-quality-levels) to balance quality and cost based on your needs:

* **Basic** - Optimized for cost efficiency while maintaining solid quality for most use cases
* **Plus** - Higher bitrate encoding that delivers improved visual quality, especially for detailed content
* **Premium** - Highest-quality encoding designed for content where visual fidelity is critical

You can set a default quality tier for your entire library and override it on a per-asset basis when needed. Basic and Plus are commonly used for general content, while Premium is useful when higher visual fidelity is required.

### API rate limits

Understand Mux API rate limits for bulk operations during migration. For specific limits and `429` handling guidance, see [API rate limits](/docs/core/make-api-requests#api-rate-limits). When migrating a large library, use a background job queue to keep within limits and get reliable, retryable job processing.
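When you're creating thousands of assets in a loop, some requests will inevitably hit the limit. A minimal retry helper like the sketch below can wrap each asset-creation call; the attempt count and backoff delays are illustrative defaults, not Mux-recommended values.

```js
// Retry an async request when it fails with HTTP 429, backing off exponentially.
// `fn` is any async function that throws an error carrying a `status` property
// (the Mux Node SDK's API errors expose a `status` field).
async function withRetries(fn, { maxAttempts = 5, baseDelayMs = 1000 } = {}) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (err.status !== 429 || attempt >= maxAttempts) throw err;
      // Exponential backoff: 1s, 2s, 4s, ...
      const delayMs = baseDelayMs * 2 ** (attempt - 1);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage during migration:
// const asset = await withRetries(() =>
//   mux.video.assets.create({ inputs: [{ url }], playback_policies: ['public'] })
// );
```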

### Job management for bulk migrations

Enqueue a job per video, control concurrency, and let the queue handle retries on failure. Rather than polling Mux to check asset status, listen for the `video.asset.ready` webhook and trigger a job to update your database or CMS when it fires.
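As a sketch, the webhook-driven update could look like the following. `updateVideo` is a hypothetical stand-in for your own database or CMS write, and in production you'd also verify the webhook signature before trusting the payload.

```js
// Handle a parsed Mux webhook payload. For asset events, `data` is the asset,
// and `passthrough` carries whatever ID you attached at ingest time.
async function handleMuxWebhook(event, updateVideo) {
  if (event.type !== 'video.asset.ready') return false;

  const asset = event.data;
  await updateVideo(asset.passthrough, {
    muxAssetId: asset.id,
    muxPlaybackId: asset.playback_ids[0].id,
    status: 'ready',
  });
  return true;
}

// Wire it into any HTTP framework, e.g. Express:
// app.post('/webhooks/mux', express.json(), async (req, res) => {
//   await handleMuxWebhook(req.body, updateVideo);
//   res.status(204).end(); // acknowledge quickly so Mux doesn't retry delivery
// });
```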

## 4. How pricing compares

A DIY stack involves multiple services and line items — storage, bandwidth, transcoding, compute, and engineering overhead. Mux uses a single usage‑based pricing model.

### Pricing model comparison

| Cost dimension | Self-managed | Mux |
| --- | --- | --- |
| **Storage** | Per-GB object storage (S3 or equivalent), billed monthly for all stored data | Per-minute of stored video — no GB math needed. See [Video pricing](/docs/pricing/video) |
| **Delivery** | Per-GB bandwidth from your CDN (CloudFront, Cloudflare, etc.), tiered by region | Per-minute of delivered video, including multi-CDN. See [Video pricing](/docs/pricing/video) |
| **Transcoding** | Per-minute or per-job from your transcoding service (MediaConvert, etc.), varies by codec and resolution | Per-minute of input video at time of encoding — one price regardless of output renditions. See [Video pricing](/docs/pricing/video) |
| **Player** | Build your own or license a third-party player | [Mux Player](/docs/guides/mux-player-web) included at no additional cost |
| **Analytics** | Separate service or build your own | [Mux Data](/docs/guides/data) included with every stream |
| **Engineering overhead** | Ongoing maintenance of orchestration, monitoring, error handling, and scaling across services | No infrastructure to manage or scale. You maintain an integration only |

### Working example: 1,000 videos at 5 minutes average

To estimate costs with a DIY stack, you'd calculate separate line items for each service:

* **Transcoding**: 5,000 minutes of input × your transcoding service's per-minute rate, multiplied by the number of output renditions
* **Storage**: total GB across all renditions × your storage provider's per-GB monthly rate
* **Delivery**: estimated viewer-minutes converted to GB (varies by bitrate and resolution) × your CDN's per-GB rate, tiered by region
* **Compute**: Lambda invocations, workflow orchestration, monitoring, etc. This varies widely depending on your architecture
* **Engineering time**: the ongoing cost of maintaining, debugging, and scaling this infrastructure

With Mux, you'd estimate three line items:

* **Encoding**: 5,000 minutes of input video × the per-minute encoding rate for your chosen [quality tier](/docs/guides/use-video-quality-levels) (free encoding if using Basic)
* **Storage**: 5,000 minutes of stored video × the per-minute storage rate
* **Delivery**: estimated viewer-minutes × the per-minute delivery rate

<Callout type="info">
  If you're converting from GB-based billing, these approximations can help: 720p ≈ 40 min/GB, 1080p ≈ 25 min/GB, 2K ≈ 15 min/GB, 4K ≈ 10 min/GB. See [Estimating video costs](/docs/pricing/estimating-video-costs) for more detail.
</Callout>
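If you'd rather script the conversion, a small helper using those rule-of-thumb ratios might look like this (the ratios are approximations; actual minutes per GB depend on the bitrate you deliver):

```js
// Approximate minutes of video per GB, by resolution
const MINUTES_PER_GB = { '720p': 40, '1080p': 25, '2k': 15, '4k': 10 };

// Convert GB-based delivery or storage figures into Mux-style minutes
function gbToMinutes(gb, resolution) {
  const ratio = MINUTES_PER_GB[resolution];
  if (!ratio) throw new Error(`Unknown resolution: ${resolution}`);
  return gb * ratio;
}

// e.g. 500 GB of monthly 1080p delivery ≈ 12,500 delivered minutes
```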

Use the [Mux pricing calculator](/pricing/calculator) to plug in your actual numbers and get a cost estimate. For tips on keeping costs optimized for your use case, see [Optimizing video costs](/docs/pricing/optimizing-video-costs).

## 5. Migration checklist

Use this checklist to track your migration progress.

### Setup phase

* \[ ] Mux account created and API credentials generated
* \[ ] SDK installed and initial API request tested
* \[ ] Test video uploaded and playback verified
* \[ ] Mux Player integrated into your application
* \[ ] Webhooks configured for `video.asset.ready`

### Content migration

* \[ ] Migration approach determined (Truckload or custom API)
* \[ ] Test migration completed with sample content
* \[ ] Preferred video quality settings determined
* \[ ] Job queue configured for bulk operations (if using API approach)
* \[ ] Mux playback IDs stored in your database or CMS
* \[ ] Video metadata carried over (e.g. `title`, `creator_id`, `external_id`) using [asset metadata](/docs/guides/add-metadata-to-your-videos)
* \[ ] Full content library migrated and verified

## 6. What's next

Once your content is migrated, you can take advantage of additional Mux platform features.

### AI workflows

Now that your video is in Mux, you can access the primitives of each asset (audio, captions, summaries, key moments) to trigger AI-driven workflows:

* [Create transcripts from your asset's audio](/docs/guides/add-autogenerated-captions-and-use-transcripts)
* [Generate video chapters](/docs/examples/ai-generated-chapters)
* [Generate titles, descriptions, and tags for your videos](/docs/examples/ai-summarizing-and-tagging)
* [Moderate your content to detect inappropriate material](/docs/examples/ai-moderation)

### Security

* **[Signed URLs](/docs/guides/secure-video-playback)**: Gate access to your content with time-limited, signed playback URLs. Ideal for paywalled content or private videos.
* **[DRM](/docs/guides/protect-videos-with-drm)**: Enterprise-grade content protection using Widevine, FairPlay, and PlayReady encryption.

### Live streaming

* **[Live streaming](/docs/guides/start-live-streaming)**: Stream live with low latency, built for interactive experiences.
* **[Simulcasting](/docs/guides/stream-live-to-3rd-party-platforms)**: Forward your live stream simultaneously to YouTube, Facebook, Twitch, and any RTMP-compatible destination.

### Media generation

* **[Thumbnails and animated GIFs](/docs/guides/get-images-from-a-video)**: Generate a thumbnail or animated preview from any point in any video.
* **[Video clipping](/docs/guides/create-clips-from-your-videos)**: Create clips from your videos instantly.
* **[Static MP4 renditions](/docs/guides/enable-static-mp4-renditions)**: Create MP4 versions for offline downloads or short-form social use cases.

### Video analytics

* **[Mux Data](/docs/guides/data)**: Track viewer engagement with plays, unique viewers, and watch time. Monitor quality of experience metrics like startup time, rebuffering, playback success, and video quality, all segmented by device, region, CDN, and more.


# Use different video quality levels
Learn how to pick an appropriate video quality and control the video quality of assets.
<Callout type="warning" title="Encoding tiers have been renamed to video quality levels">
  [We recently renamed encoding tiers to video quality levels. Read the blog for more details.](https://www.mux.com/blog/no-one-said-naming-was-easy-encoding-tiers-are-now-video-quality-levels)
</Callout>

## Introducing video quality levels

Mux Video supports encoding content with three different video quality levels. The video quality level informs the quality, cost, and available platform features for the asset.

### Basic

The *basic* video quality level uses a reduced encoding ladder, with a lower target video quality, suitable for simpler video use cases.

There is no charge for video encoding when using basic quality.

### Plus

The *plus* video quality level encodes your video at a consistently high quality. Assets encoded with the plus quality use an AI-powered per-title encoding technology that boosts bitrates for high-complexity content to ensure high-quality video, while reducing bitrates for lower-complexity content to save bandwidth without sacrificing quality.

The plus quality level incurs a [cost per video minute of encoding](https://mux.com/pricing).

### Premium

The *premium* video quality level uses the same AI-powered per-title encoding technology as plus, but is tuned for the presentation of premium media content where the highest video quality is required, such as live sports or studio movies.

The premium quality level incurs a higher [cost per video minute of encoding, storage, and delivery](https://mux.com/pricing).

## Set a video quality level when creating an asset

The video quality of an asset is controlled by setting the `video_quality` attribute on your <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create-asset API call</ApiRefLink>, so to create an asset with the `basic` quality level, you should set `"video_quality": "basic"` as shown below.

```json
// POST /video/v1/assets
{
	"inputs": [
		{
			"url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
		}
	],
	"playback_policies": [
		"public"
	],
	"video_quality": "basic"
}
```

And of course you can select the `video_quality` within Direct Uploads, too; you just need to set the same `"video_quality": "basic"` field in the `new_asset_settings` of your <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create-direct-upload API call</ApiRefLink>.

```json
// POST /video/v1/uploads
{
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
  },
  "cors_origin": "*"
}
```

## Set the video quality when creating a live stream

To set the `video_quality` for a live stream, you just need to set the `"video_quality"` field within the `new_asset_settings` configuration of your <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create-live-stream API call</ApiRefLink> to `"plus"` or `"premium"`, as shown below.

```json
// POST /video/v1/live-streams
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "plus"
  }
}
```

All on-demand assets created from the live stream will also inherit the given quality level. Live streams can currently only use the `plus` or `premium` video quality levels.

## Supported features

Assets using different video quality levels have different features and limits available to them. Refer to the table below for details:

| Feature | Basic | Plus | Premium |
| :-- | :-- | :-- | :-- |
| [JIT encoding](https://www.mux.com/features#just-in-time-encoding) | ✅ | ✅ | ✅ |
| [Multi CDN delivery](https://www.mux.com/blog/multi-cdn-support-in-mux-video-for-improved-performance-and-reliability) | ✅ | ✅ | ✅ |
| [Mux Data included](https://mux.com/data) | ✅ | ✅ | ✅ |
| [Mux Player included](https://mux.com/player) | ✅ | ✅ | ✅ |
| [Thumbnails, Gifs,](/docs/guides/get-images-from-a-video) [Storyboards](/docs/guides/create-timeline-hover-previews) | ✅ | ✅ | ✅ |
| [Watermarking](/docs/guides/add-watermarks-to-your-videos) | ✅ | ✅ | ✅ |
| [Signed playback IDs and playback restrictions](/docs/guides/secure-video-playback) | ✅ | ✅ | ✅ |
| [On-Demand](/docs/core/stream-video-files) | ✅ | ✅ | ✅ |
| [Master Access](/docs/guides/download-for-offline-editing) | ✅ | ✅  | ✅ |
| [Audio-only Assets](https://www.mux.com/blog/have-you-heard-we-support-audio-only-files-too) | ✅ | ✅ | ✅ |
| [Auto-generated captions](/docs/guides/add-autogenerated-captions-and-use-transcripts) | ✅ | ✅ | ✅ |
| [Clipping to a new asset](/docs/guides/create-clips-from-your-videos)| ✅ | ✅ | ✅ |
| [Multi-track audio](/docs/guides/add-alternate-audio-tracks-to-your-videos) | ✅ | ✅ | ✅ |
| [Live Streaming](/docs/guides/start-live-streaming) | ❌ | ✅ |  ✅ |
| [Adaptive bitrate ladder](https://www.mux.com/video-glossary/abr-adaptive-bitrate) | Reduced | Standard | Extended |
| [Maximum streaming resolution](/docs/guides/stream-videos-in-4k) | 2160p (4K) | 2160p (4K) | 2160p (4K) |
| [MP4 support](/docs/guides/enable-static-mp4-renditions) \* | ✅  | ✅ | ✅ |
| [DRM](/docs/guides/protect-videos-with-drm) <BetaTag /> | ❌ | ✅ | ✅ |

## Examples for comparison

| Video quality | Content complexity | Playback page |
| :-- | :-- | :-- |
| basic | Simple | [moodeng - basic](https://player.mux.com/elUDQkfDSv43pR8ejgrURIGcNDzx00Jgq) |
| basic | Complex | [waterfall - basic](https://player.mux.com/z00qLVB3NsE1FOE7mCKboHGXQDQIun4sp) |
| plus | Simple | [moodeng - plus](https://player.mux.com/Vl7wZGmm3ztT01VL8UzTv9QEfPleP4kMR) |
| plus | Complex | [waterfall - plus](https://player.mux.com/aTFGdFMLBgdvegF29Orvoj026mpkb8v6Q) |
| premium | Simple | [moodeng - premium](https://player.mux.com/JQ1P00AC3EjGfL7BjI951M006dvL4asVce) |
| premium | Complex | [waterfall - premium](https://player.mux.com/Q8Vk00DqGWTGS40201TtXROm67LNCMlfxed) |

\* MP4 pricing depends on the video quality level of your asset. [Learn more.](/docs/pricing/video#do-you-charge-for-mp4-downloads)


# Stream videos in 4K
Learn how to ingest, store, and deliver your videos in 4K resolutions
## Introduction to 4K

Mux Video supports ingesting, storing, and delivering on-demand video assets in high resolutions up to and including 4K (2160p). At this time, 2K and 4K output is not supported for Mux [Live Streaming](https://www.mux.com/docs/guides/start-live-streaming). Live Streams will accept 2K and 4K input, but the output will be capped to 1080p (and will be billed as 1080p).

Much premium video content is produced in 4K, and user-generated content increasingly is as well, since mobile devices are now capable of capturing 4K video.

Mux Video also supports [2K and 2.5K video on-demand video assets](/docs/guides/stream-videos-in-4k#stream-2k-and-25k-content).

## Create a 4K asset

To ingest, store, and deliver an asset in 4K, you'll need to set the new `max_resolution_tier` attribute on your <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create-asset API call</ApiRefLink>.

```json
// POST /video/v1/assets
{
	"inputs": [
		{
			"url": "https://storage.googleapis.com/muxdemofiles/mux-4k.mp4"
		}
	],
	"playback_policies": [
		"public"
	],
	"video_quality": "basic",
	"max_resolution_tier": "2160p"
}
```

This field controls the maximum resolution that we'll encode, store, and deliver your media in. We don't automatically ingest content at 4K, so you can avoid unexpectedly high ingest bills. If you send us 4K content today and don't set `max_resolution_tier`, nothing changes in your bill.

This also allows you to build applications where some of your content creators are able to upload 4K content while others remain capped at 1080p.
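For example, you might derive the resolution cap from your own user model at upload time. The `plan` values here are hypothetical application logic, not a Mux concept:

```js
// Decide the maximum resolution Mux should produce for a given creator
function maxResolutionForCreator(creator) {
  return creator.plan === 'pro' ? '2160p' : '1080p';
}

// Then pass it along when creating the asset:
// const asset = await mux.video.assets.create({
//   inputs: [{ url: videoUrl }],
//   playback_policies: ['public'],
//   max_resolution_tier: maxResolutionForCreator(creator),
// });
```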

And of course you can use 4K with Direct Uploads, too; you just need to set the same `"max_resolution_tier": "2160p"` field in the `new_asset_settings` of your <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create-direct-upload API call</ApiRefLink>.

```json
// POST /video/v1/uploads
{
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic",
    "max_resolution_tier": "2160p"
  },
  "cors_origin": "*"
}
```

## Play your assets in 4K

For assets with 4K enabled at ingest, we'll automatically add 2K and 4K renditions to your HLS Playback URLs. Mux uses high-bitrate H.264 for delivering 4K content, which is supported on a wide range of devices and players, including Mux Player shown below.

<Player playbackId="fss00bwClhYMynhxeE2Hv757J02VI68KY5" thumbnailTime="0" title="4K Video Demo" />

While we've tested playback and built device detection rules that should protect you against unexpected playback failures, you should always test playback on your own device footprint before enabling 4K widely on your platform.

## Limiting playback resolution below 4K

Of course, you might not want all of your viewers to be able to watch your content in 4K; many streaming platforms offer 4K playback only to their higher subscription tiers. You can implement this by controlling playback resolution with the `max_resolution` query parameter on your playback URLs, as shown below.

```
https://stream.mux.com/${PLAYBACK_ID}.m3u8?max_resolution=1080p
```
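For example, you could append the parameter based on your own subscription model when building playback URLs. The viewer tiers here are hypothetical application logic:

```js
// Only premium viewers get the full-resolution stream; everyone else is capped
function playbackUrl(playbackId, viewer) {
  const base = `https://stream.mux.com/${playbackId}.m3u8`;
  return viewer.tier === 'premium' ? base : `${base}?max_resolution=1080p`;
}
```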

## Preparing 4K inputs

Mux uses just-in-time (JIT) encoding to make sure your assets are available as soon as possible after you create them, and this includes 4K assets.

Most of the usual restrictions for standard inputs still apply when you're using 4K, but there are a few additional requirements you should be aware of:

* The input must have a maximum dimension of 4096 pixels
* The input must have a maximum keyframe interval of 10 seconds
* The input must be 20 Mbps or less
* The input must have a frame rate between 5 fps and 60 fps

[You can find full details of the standard input specification for 1080p and 4K content in our documentation.](/docs/guides/minimize-processing-time#standard-input-specs)

## Stream 2K and 2.5K content

Mux Video also supports 2K and 2.5K (1440p) content. To have your asset processed at 2K/2.5K, set `"max_resolution_tier": "1440p"` in your create-asset (or create-direct-upload) calls instead.


# Upload files directly
Allow your users to upload content directly to Mux.
## Overview

Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediary steps. You still get to control who gets an authenticated URL, how long it's viable, and, of course, the Asset settings used when the upload is complete.

The most common use-case for Direct Uploads is in client applications, such as native mobile apps and the browser, but you could also use them to upload directly from your server or in a command line tool. Any time you don't feel the need to store the original on your own, just generate a signed URL and push the content directly.

Let's start by walking through the simplest use case of getting a file directly into Mux.

## Upload a file directly into Mux

### 1. Create an authenticated URL

The first step is creating a new Direct Upload with the Mux Asset settings you want. The Mux API will return an authenticated URL that you can use directly in your client apps, as well as an ID specific to that Direct Upload so you can check the status later via the API.

```curl

curl https://api.mux.com/video/v1/uploads \
  -X POST \
  -H "Content-Type: application/json" \
  -u MUX_TOKEN_ID:MUX_TOKEN_SECRET \
  -d '{ "new_asset_settings": { "playback_policies": ["public"], "video_quality": "basic" }, "cors_origin": "*" }'

```

```elixir

# config/dev.exs
config :mux,
  access_token_id: "MUX_TOKEN_ID",
  access_token_secret: "MUX_TOKEN_SECRET"

client = Mux.client()
params = %{"new_asset_settings" => %{"playback_policies" => ["public"], "video_quality" => "basic"}, "cors_origin" => "https://your-browser-app.com"}
Mux.Video.Uploads.create(client, params)

```

```go

import (
  muxgo "github.com/muxinc/mux-go"
)

client := muxgo.NewAPIClient(
  muxgo.NewConfiguration(
    muxgo.WithBasicAuth(os.Getenv("MUX_TOKEN_ID"), os.Getenv("MUX_TOKEN_SECRET")),
  ))

car := muxgo.CreateAssetRequest{PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}, VideoQuality: "basic"}
cur := muxgo.CreateUploadRequest{NewAssetSettings: car, Timeout: 3600, CorsOrigin: "*"}
u, err := client.DirectUploadsApi.CreateDirectUpload(cur)

```

```node

import Mux from '@mux/mux-node';
const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET
});

mux.video.uploads.create({
  cors_origin: 'https://your-browser-app.com', 
  new_asset_settings: {
    playback_policies: ['public'],
    video_quality: 'basic'
  }
}).then(upload => {
  // upload.url is what you'll want to return to your client.
});

```

```php

$config = MuxPhp\Configuration::getDefaultConfiguration()
  ->setUsername(getenv('MUX_TOKEN_ID'))
  ->setPassword(getenv('MUX_TOKEN_SECRET'));

$uploadsApi = new MuxPhp\Api\DirectUploadsApi(
    new GuzzleHttp\Client(),
    $config
);

$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC], "video_quality" => "basic"]);
$createUploadRequest = new MuxPhp\Models\CreateUploadRequest(["timeout" => 3600, "new_asset_settings" => $createAssetRequest, "cors_origin" => "https://your-browser-app.com"]);
$upload = $uploadsApi->createDirectUpload($createUploadRequest);

```

```python

import mux_python

configuration = mux_python.Configuration()
configuration.username = os.environ['MUX_TOKEN_ID']
configuration.password = os.environ['MUX_TOKEN_SECRET']

uploads_api = mux_python.DirectUploadsApi(mux_python.ApiClient(configuration))

create_asset_request = mux_python.CreateAssetRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC], video_quality="basic")
create_upload_request = mux_python.CreateUploadRequest(timeout=3600, new_asset_settings=create_asset_request, cors_origin="*")
create_upload_response = uploads_api.create_direct_upload(create_upload_request)

```

```ruby

MuxRuby.configure do |config|
  config.username = ENV['MUX_TOKEN_ID']
  config.password = ENV['MUX_TOKEN_SECRET']
end

uploads_api = MuxRuby::DirectUploadsApi.new

create_asset_request = MuxRuby::CreateAssetRequest.new
create_asset_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_asset_request.video_quality = "basic"
create_upload_request = MuxRuby::CreateUploadRequest.new
create_upload_request.new_asset_settings = create_asset_request
create_upload_request.timeout = 3600
create_upload_request.cors_origin = "https://your-browser-app.com"
upload = uploads_api.create_direct_upload(create_upload_request)

```



### 2. Use the URL to upload in your client

Once you've got an upload object, you'll use the authenticated URL it includes to make a `PUT` request that includes the file in the body. The URL is resumable, which means if it's a *really* large file you can send your file in pieces and pause/resume at will.

```ReactNative

async function uploadVideo () {
  // videoUri here is the local URI to the video file on the device
  // this can be obtained with an ImagePicker library like expo-image-picker
  const imageResponse = await fetch(videoUri)
  const blob = await imageResponse.blob()

  // Create an authenticated Mux URL
  // this request should hit your backend and return a "url" in the
  // response body
  const uploadResponse = await fetch('/backend-api')
  const uploadUrl = (await uploadResponse.json()).url

  try {
    let res = await fetch(uploadUrl, {
      method: 'PUT',
      body: blob,
      headers: { "content-type": blob.type}
    });
    console.log("Upload is complete");
  } catch(error) {
    console.error(error);
  }
};

```

```curl

curl -v -X PUT -T myawesomevideo.mp4 "$URL_FROM_STEP_ONE"

```

```js

import * as UpChunk from '@mux/upchunk';

const upload = UpChunk.createUpload({
  // getUploadUrl is a function that resolves with the upload URL generated
  // on the server-side
  endpoint: getUploadUrl,
  // picker here is a file picker HTML element
  file: picker.files[0],
  chunkSize: 5120, // Uploads the file in ~5mb chunks
});

// subscribe to events
upload.on('error', err => {
  console.error('💥 🙀', err.detail);
});

upload.on('progress', progress => {
  console.log('Uploaded', progress.detail, 'percent of this file.');
});

upload.on('success', () => {
  console.log("Wrap it up, we're done here. 👋");
});

```

```node

// assuming you're using ESM
import fs from "fs";
import got from "got";

const uploadUrl = process.env.UPLOAD_URL; // Authenticated URL from step 1

got.put(uploadUrl, {
  body: fs.createReadStream('/path/to/your/file'),
});

```



If you were following along with these examples, you should find new Assets in the Mux Dashboard with the settings you specified in the original upload create request, containing the video you uploaded in the second step.

If the upload doesn't work via cURL, be sure that you've put quotes around the upload URL.

## Using Direct Uploads in your application

The examples above are a great way to upload a one-off file into Mux, but let's talk about how this workflow looks in your actual application. Typically you're going to want to do a few things:

* Authenticate the request that gives the user a signed URL so random people don't start ingesting Assets into your Mux account.
* Save information in your application about the file when the user creates the upload, such as who uploaded it and when, details about the video like title, tags, etc.
* Make sure the Asset that's ultimately created from that upload is associated with that information.

Just like Assets, Direct Uploads have their own events, and then the Asset created off the upload has the usual events as well. When you receive the `video.upload.asset_created` event you'll find an `asset_id` key that you could use in your application to tie the Asset back to the upload, but that gets tricky if your application misses events or they come out of order. To keep things simple, we like to use the `passthrough` key when creating an Asset. Let's look at how the passthrough workflow would work in a real application.

<Callout type="info" title="Upload reliably with our Upload SDKs">
  We provide SDKs for Android, iOS, iPadOS, and web frontends that handle the difficult parts of the upload process, such as handling large files and preprocessing video for size and cost. Once your backend has created an authenticated URL for the upload, you can give it to one of our Upload SDKs to reliably process and upload the video.

  For more information, check out our upload SDK guides:

  * [Upload directly from an Android app](/docs/guides/upload-video-directly-from-android)
  * [Upload directly from iOS or iPadOS](/docs/guides/upload-video-directly-from-ios-or-ipados)
  * [Upload directly from your Web App](/docs/guides/mux-uploader)
</Callout>

<Callout type="info" title="Next.js React example">
  [with-mux-video](https://github.com/vercel/next.js/tree/canary/examples/with-mux-video) is a full open-source example application that uses direct uploads.

  `npx create-next-app --example with-mux-video with-mux-video-app`

  Another open-source example application is [stream.new](https://stream.new). GitHub repo link: [muxinc/stream.new](https://github.com/muxinc/stream.new)

  `git clone git@github.com:muxinc/stream.new.git`

  Both of these example applications use [Next.js](https://nextjs.org/), UpChunk, Mux Direct Uploads, and Mux playback.
</Callout>

### Creating an `/upload` route in the application

In the route we build to create and return a new Direct Upload, we'll first create a new object in our application that includes a generated ID and all the additional information we want about that Asset. *Then* we'll create the Direct Upload and include that generated ID in the `passthrough` field.

```node
const { json, send } = require('micro');
const { v1: uuid } = require('uuid');
const Mux = require('@mux/mux-node');

// This assumes you have MUX_TOKEN_ID and MUX_TOKEN_SECRET
// environment variables.
const mux = new Mux();

// All the 'db' references here are going to be total pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  const id = uuid();
  // Go ahead and grab any info you want from the request body.
  const assetInfo = await json(req);

  // Create a new upload using the Mux SDK.
  const upload = await mux.video.uploads.create({
    // Set the CORS origin to your application.
    cors_origin: 'https://your-app.com',

    // Specify the settings used to create the new Asset after
    // the upload is complete
    new_asset_settings: {
      passthrough: id,
      playback_policy: ['public'],
      video_quality: 'basic'
    }
  });

  db.put(id, {
    // save the upload ID in case we need to update this based on
    // 'video.upload' webhook events.
    uploadId: upload.id,
    metadata: assetInfo,
    status: 'waiting_for_upload',
  });

  // Now send back that ID and the upload URL so the client can use it!
  send(res, 201, { id, url: upload.url });
}
```

Excellent! Now we've got a working endpoint to create new Mux uploads that we can use in our Node app or deploy as a serverless function. Next we need to make sure we have an endpoint that handles the Mux webhooks when they come back.

```node
const { json, send } = require('micro');

// More db pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  // We'll grab the request body again, this time grabbing the event
  // type and event data so we can easily use it.
  const { type: eventType, data: eventData } = await json(req);

  switch (eventType) {
    case 'video.asset.created': {
      // This means an Asset was successfully created! We'll get
      // the existing item from the DB first, then update it with the
      // new Asset details
      const item = await db.get(eventData.passthrough);
      // Just in case the events got here out of order, make sure the
      // asset isn't already set to ready before blindly updating it!
      if (!item.asset || item.asset.status !== 'ready') {
        await db.put(item.id, {
          ...item,
          asset: eventData,
        });
      }
      break;
    }
    case 'video.asset.ready': {
      // This means the Asset is ready to stream! This is the final
      // state of an Asset in this stage of its lifecycle, so we don't need
      // to check anything first.
      const item = await db.get(eventData.passthrough);
      await db.put(item.id, {
        ...item,
        asset: eventData,
      });
      break;
    }
    case 'video.upload.cancelled': {
      // This fires when you decide you want to cancel an upload, so you
      // may want to update your internal state to reflect that it's no longer
      // active. Look the item up by the upload ID we saved earlier.
      const item = await db.findByUploadId(eventData.id);
      await db.put(item.id, { ...item, status: 'cancelled_upload' });
      break;
    }
    default:
      // Mux sends webhooks for *lots* of things, but we'll ignore those for now
      console.log('some other event!', eventType, eventData);
  }
}
```

Great! Now we've got our application listening for events from Mux, then updating our DB to reflect the relevant changes. You could also do cool things in the webhook handler like send your customers events via [Server-Sent Events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) or [WebSockets](https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API).

## Handle large files with UpChunk

In general, just making a `PUT` request with the file in the body is going to work fine for most client applications and content. When the files start getting a little bigger, you can stretch that by making sure to stream the file from the disk into the request. With a reliable connection, that can take you to gigabytes worth of video, but if that request fails, you or your customer are going to have to start the whole thing over again.

In those scenarios where you have really big files and potentially need to pause/restart a transfer, you can chunk up the file and use the resumable features of the upload endpoint! If you're doing it in a browser we wrote [UpChunk](https://github.com/muxinc/upchunk) to help, but the process isn't nearly as scary as it sounds.

### Installing UpChunk

**With NPM**

```shell
npm install --save @mux/upchunk
```

**With yarn**

```shell
yarn add @mux/upchunk
```

**With CDN**

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/upchunk@2"></script>
```

### Using UpChunk

```js

import * as UpChunk from '@mux/upchunk';

// Pretend you have an HTML page with an input like: <input id="picker" type="file" />
const picker = document.getElementById('picker');

picker.onchange = () => {
  const getUploadUrl = () =>
    fetch('/the-backend-endpoint').then(res => {
      if (!res.ok) throw new Error('Error getting an upload URL :(');
      return res.text();
    });

  const upload = UpChunk.createUpload({
    endpoint: getUploadUrl,
    file: picker.files[0],
    chunkSize: 5120, // Uploads the file in ~5MB chunks
  });

  // subscribe to events
  upload.on('error', err => {
    console.error('💥 🙀', err.detail);
  });
};

```

```react

import React, { useState } from 'react';
import * as UpChunk from '@mux/upchunk';

function Page() {
  const [progress, setProgress] = useState(0);
  const [statusMessage, setStatusMessage] = useState(null);

  const handleUpload = async (inputRef) => {
    try {
      const response = await fetch('/your-server-endpoint', { method: 'POST' });
      const url = await response.text();

      const upload = UpChunk.createUpload({
        endpoint: url, // Authenticated url
        file: inputRef.files[0], // File object with your video file’s properties
        chunkSize: 5120, // Uploads the file in ~5MB chunks
      });

      // Subscribe to events
      upload.on('error', error => {
        setStatusMessage(error.detail);
      });

      upload.on('progress', progress => {
        setProgress(progress.detail);
      });

      upload.on('success', () => {
        setStatusMessage("Wrap it up, we're done here. 👋");
      });
    } catch (error) {
      setStatusMessage(error.message);
    }
  }

  return (
    <div className="page-container">
      <h1>File upload button</h1>
      <label htmlFor="file-picker">Select a video file:</label>
      <input type="file" onChange={(e) => handleUpload(e.target)}
        id="file-picker" name="file-picker" />

      <label htmlFor="upload-progress">Upload progress:</label>
      <progress id="upload-progress" value={progress} max="100" />

      <em>{statusMessage}</em>
    </div>
  );
}

export default Page;

```



### Alternatives to UpChunk

* Split the file into chunks that are a multiple of 256KB (`256 * 1024` bytes). For example, if you wanted 20MB chunks, you'd make each one 20,971,520 bytes (`20 * 1024 * 1024`). The exception is the final chunk, which can just be the remainder of the file. Bigger chunks mean fewer requests and a faster overall upload, but each chunk is effectively its own upload: if one fails, you only have to restart that chunk, and larger chunks take longer to retry.
* Set a couple of headers:
  * `Content-Length`: the size of the current chunk you're uploading.
  * `Content-Range`: what bytes you're currently uploading. For example, if you've got a 10,000,000 byte file and you're uploading in ~1MB chunks, this header would look like `Content-Range: bytes 0-1048575/10000000` for the first chunk.
* Now use a `PUT` request like we were for "normal" uploads, just with those additional headers and each individual chunk as the body.
* If the server responds with a `308`, you're good to continue uploading! It will respond with a `200 OK` or `201 Created` when the upload is completed.
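The steps above can be sketched in code. This is an illustrative outline rather than a drop-in client: `uploadUrl` stands in for the authenticated URL from your Direct Upload, the loop assumes a global `fetch` (Node 18+), and the helpers just package the chunk math and headers described above.

```javascript
const CHUNK_UNIT = 256 * 1024; // chunks must be a multiple of 256KB

// Round a desired chunk size down to the nearest multiple of 256KB
function alignChunkSize(desiredBytes) {
  return Math.max(CHUNK_UNIT, Math.floor(desiredBytes / CHUNK_UNIT) * CHUNK_UNIT);
}

// Build the headers for one chunk of a file with a known total size
function chunkHeaders(byteStart, chunkLength, totalSize) {
  const byteEnd = byteStart + chunkLength - 1;
  return {
    'Content-Length': String(chunkLength),
    'Content-Range': `bytes ${byteStart}-${byteEnd}/${totalSize}`,
  };
}

// PUT each chunk in order; 308 means "keep going", 200/201 means done
async function uploadInChunks(uploadUrl, fileBuffer, chunkSize = alignChunkSize(20 * 1024 * 1024)) {
  for (let start = 0; start < fileBuffer.length; start += chunkSize) {
    const chunk = fileBuffer.subarray(start, start + chunkSize);
    const res = await fetch(uploadUrl, {
      method: 'PUT',
      headers: chunkHeaders(start, chunk.length, fileBuffer.length),
      body: chunk,
    });
    if (!res.ok && res.status !== 308) {
      throw new Error(`Chunk upload failed with status ${res.status}`);
    }
  }
}
```

A production version would also retry failed chunks, which this sketch leaves out for brevity.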

## Upload streamed data as it becomes available

When dealing with streaming data where the total file size is unknown until the end—such as live recordings or streaming AI-generated data—you can upload the data to Mux in chunks as it becomes available.

This approach has several benefits:

* No need to know the total file size upfront
* Reduced memory usage on the client, because you're uploading chunks and releasing them instead of buffering the entire file in memory
* Faster uploads, because you're uploading chunks while the data is still being produced instead of waiting for the entire file to be recorded

### Example: MediaRecorder

When recording media directly from a user's device using the [MediaRecorder API](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder), the total file size is unknown until the recording is complete. To handle this, you can upload the media data to Mux in chunks as it becomes available, without needing to know the total size up front.

Let's look at an example of how to do this with a web app. First, we'll set up the MediaRecorder to capture media data in chunks. Each chunk will be passed to the `uploadChunk` function, which will upload it to Mux.

<Callout type="info" title="Open source example repository">
  You can find a complete example repository demonstrating this approach in our [examples repo](https://github.com/muxinc/examples/tree/main/mediarecorder-streaming-uploads).
</Callout>

Start by declaring some global variables to track recording state and upload progress.

```javascript
// Global variables to track recording state and upload progress
let mediaRecorder;
let mediaStream;
let buffer; // Shared buffer so stopRecording can flush any remaining data
let nextByteStart = 0;
const CHUNK_SIZE = 8 * 1024 * 1024; // 8MB chunks - must be multiple of 256KB
const maxRetries = 3; // Number of upload retry attempts
const lockName = 'uploadLock'; // Used by Web Locks API for sequential uploads
let activeUploads = 0; // Track number of chunks currently uploading
let isFinalizing = false; // Flag to prevent new uploads during finalization
```

Next, configure the MediaRecorder to capture media data in chunks.

```javascript
async function startRecording() {
  // Request access to user's media devices
  mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

  // Use a widely supported MIME type for maximum compatibility
  const mimeType = 'video/webm';

  // Initialize MediaRecorder with optimal settings
  mediaRecorder = new MediaRecorder(mediaStream, {
    mimeType,
    videoBitsPerSecond: 5000000, // 5 Mbps video bitrate
    audioBitsPerSecond: 128000   // 128 kbps audio bitrate
  });

  // Buffer to accumulate media data until we have enough for a chunk.
  // Assign to the shared `buffer` variable (instead of declaring it locally)
  // so that stopRecording can flush whatever is left over.
  buffer = new Blob([], { type: mimeType });
  let bufferSize = 0;

  // Handle incoming media data
  mediaRecorder.ondataavailable = async (event) => {
    // Only process if we have data and aren't in the finalization phase
    if (event.data.size > 0 && !isFinalizing) {
      // Combine the new data with our existing buffer
      // We use a Blob to efficiently handle large binary data
      // The type must match what we specified when creating the MediaRecorder
      buffer = new Blob([buffer, event.data], { type: mimeType });
      bufferSize += event.data.size;

      // Keep processing chunks as long as we have enough data
      // This ensures we maintain a consistent chunk size of 8MB (CHUNK_SIZE)
      // which is required by Mux's direct upload API
      while (bufferSize >= CHUNK_SIZE) {
        // Extract exactly CHUNK_SIZE bytes from the start of our buffer
        const chunk = buffer.slice(0, CHUNK_SIZE);
        
        // Keep the remainder in the buffer for the next chunk
        buffer = buffer.slice(CHUNK_SIZE);
        bufferSize -= CHUNK_SIZE;

        // Upload this chunk, passing false for isFinalChunk since we're still recording
        // nextByteStart tracks where in the overall file this chunk belongs
        await uploadChunk(chunk, nextByteStart, false);
        
        // Increment our position tracker by the size of the chunk we just uploaded
        nextByteStart += chunk.size;
      }
      // Any remaining data stays in the buffer until we get more from ondataavailable
    }
  };

  // Start recording, getting data every 500ms
  mediaRecorder.start(500);
}
```

Uploaded chunks need to be delivered in multiples of 256KB (`256 * 1024` bytes). Since the chunks provided by the `MediaRecorder` API can be smaller than that, you'll need to collect them in a buffer until you have an aggregate chunk that is at least 256KB in size. 8MB is a good size for a chunk, so we'll use that as our chunk size in this example.
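The buffering rule above can be sketched as a small, self-contained helper. This is illustrative only: it uses Node's `Buffer` for easy testing, while the browser example above accumulates `Blob`s the same way.

```javascript
const CHUNK_SIZE = 8 * 1024 * 1024; // 8MB, a multiple of 256KB

// Given the current buffer and a newly arrived piece of media data,
// return the full-size chunks that are ready to upload plus the
// leftover bytes to keep buffering until more data arrives.
function drainBuffer(buffer, incoming, chunkSize = CHUNK_SIZE) {
  let combined = Buffer.concat([buffer, incoming]);
  const ready = [];
  while (combined.length >= chunkSize) {
    // Emit exactly chunkSize bytes at a time
    ready.push(combined.subarray(0, chunkSize));
    combined = combined.subarray(chunkSize);
  }
  return { ready, remainder: combined };
}
```

Whatever ends up in `remainder` when recording stops is uploaded as the final (possibly undersized) chunk.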

When the recording is complete, call the `stopRecording` function to upload the final chunk and clean up the MediaRecorder.

```javascript
async function stopRecording() {
  // Only proceed if we have an active mediaRecorder
  if (mediaRecorder && mediaRecorder.state !== 'inactive') {
    // Stop recording new data
    mediaRecorder.stop();
    // Set flag to prevent new uploads from starting during finalization
    isFinalizing = true;

    // Wait for any in-progress chunk uploads to complete
    // Check every 100ms until all uploads are done
    while (activeUploads > 0) {
      await new Promise(resolve => setTimeout(resolve, 100));
    }

    // If there's any remaining data in the buffer that hasn't been uploaded yet
    // Upload it as the final chunk (isFinalChunk = true)
    if (buffer.size > 0) {
      await uploadChunk(buffer, nextByteStart, true);
      nextByteStart += buffer.size;
    }

    // Clean up by stopping all media tracks (camera, mic etc)
    if (mediaStream) {
      mediaStream.getTracks().forEach(track => track.stop());
    }

    // Reset finalization flag now that we're done
    isFinalizing = false;
  }
}
```

Within the `uploadChunk` function, perform a `PUT` request to the authenticated Mux upload URL. Use the `Content-Range` header to indicate the byte range of the chunk being uploaded. Since the total file size is unknown, use `*` as the total size until the final chunk is uploaded.

```javascript
async function uploadChunk(chunk, byteStart, isFinalChunk) {
  // Calculate the end byte position for this chunk by adding chunk size to start position
  // Subtract 1 since byte ranges are inclusive (e.g. bytes 0-499 is 500 bytes)
  const byteEnd = byteStart + chunk.size - 1;

  // For the total size in the Content-Range header:
  // - If this is the final chunk, use the actual total size (byteEnd + 1)
  // - Otherwise use '*' since we don't know the final size yet
  const totalSize = isFinalChunk ? byteEnd + 1 : '*';

  // Set required headers for resumable upload:
  // - Content-Length: Size of this chunk in bytes
  // - Content-Range: Byte range being uploaded in format "bytes START-END/TOTAL"
  const headers = {
    'Content-Length': chunk.size.toString(),
    'Content-Range': `bytes ${byteStart}-${byteEnd}/${totalSize}`,
  };

  let attempt = 0;
  let success = false;

  // Use Web Locks API to enforce sequential uploads
  await navigator.locks.request(lockName, async () => {
    activeUploads++;
    while (attempt < maxRetries && !success) {
      try {
        const response = await fetch('MUX_DIRECT_UPLOAD_URL_HERE', {
          method: 'PUT',
          headers,
          body: chunk
        });

        if (response.ok || response.status === 308) {
          success = true;
        } else {
          throw new Error(`Upload failed with status: ${response.status}`);
        }
      } catch (error) {
        attempt++;
        if (attempt < maxRetries) {
          await new Promise(resolve => setTimeout(resolve, attempt * 1000)); // Wait a bit longer before each retry
        } else {
          throw error;
        }
      }
    }
    activeUploads--;
  });

  return success;
}
```

<Callout type="warning" title="Maintain sequential uploads with the Web Locks API">
  In the provided example, the [`navigator.locks.request`](https://developer.mozilla.org/en-US/docs/Web/API/Web_Locks_API) method is used to enforce sequential chunk uploads. This is necessary because if the `MediaRecorder` is stopped, the `ondataavailable` event can trigger multiple times simultaneously, which would cause multiple concurrent uploads if not properly synchronized. If you attempt to upload the final chunk before the previous uploads have completed, the upload will fail.
</Callout>

The final chunk is indicated by the `isFinalChunk` parameter, which is passed to the `uploadChunk` function. When `isFinalChunk` is `true`, the function will upload the remaining data in the buffer as the final chunk and modify the `totalSize` to reflect the total amount of data that was uploaded.
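As a quick sketch, the `Content-Range` logic reduces to a small pure helper (a hypothetical name, mirroring the calculation inside `uploadChunk` above):

```javascript
// Build the Content-Range value for a streamed chunk: "*" for the total
// while the stream is still open, the real total on the final chunk.
function contentRange(byteStart, chunkLength, isFinalChunk) {
  const byteEnd = byteStart + chunkLength - 1; // byte ranges are inclusive
  const totalSize = isFinalChunk ? byteEnd + 1 : '*';
  return `bytes ${byteStart}-${byteEnd}/${totalSize}`;
}
```

The first 8MB chunk would carry `bytes 0-8388607/*`, while a final 1,000-byte chunk starting at byte 8,388,608 would carry `bytes 8388608-8389607/8389608`.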


# Upload video directly from an Android app
Allow your users to upload content directly to Mux from a native Android app.
Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediate steps. You still get to control who gets an authenticated URL, how long it's viable, and, of course, the Asset settings used when the upload is complete.

The Android Upload SDK allows a client application to upload video files from an Android device to Mux Video. The upload can be paused before completion and resume where it left off, even after process death.

Let's start by uploading a video directly into Mux from an Android application. The code from these examples can be found in our [upload example app](https://github.com/muxinc/android-upload-sdk/tree/main/app).

## Gradle setup

To integrate the Mux Upload SDK into your Android app, you first have to add it to your project, then create a `MuxUpload` object using an upload URL your app can fetch from a trusted server. Once you create the `MuxUpload`, you can start it with `start()`.

A working example can be found alongside our source code [here](https://github.com/muxinc/android-upload-sdk/tree/main/app).

## Add Mux's maven repository to your project

Add our maven repository to your project's `repositories` block. Depending on your setup, you may need to do this in either `settings.gradle` under `dependencyResolutionManagement` or your project's `build.gradle`.

```gradle_kts
// In your repositories block
maven {
  url = uri("https://muxinc.jfrog.io/artifactory/default-maven-release-local")
}
```

```gradle_groovy
// In your repositories block
maven {
  url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
}
```



## Add the Upload SDK to your app's dependencies

Add the upload SDK to your app's `dependencies` block in its `build.gradle` file.

```gradle_kts
// in your app's dependencies
implementation("com.mux.video:upload:1.0.4")
```

```gradle_groovy
// in your app's dependencies
implementation "com.mux.video:upload:1.0.4"
```



## Upload a video

In order to securely upload a video, you will need to create a `PUT` URL for your video. Once you have created the upload URL, return it to the Android client, then use the `MuxUpload` class to upload the file to Mux.

## Getting an Upload URL to Mux Video

In order to upload a new video to Mux, you must first create a new <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Upload</ApiRefLink> to receive the file. The Direct Upload will contain a resumable PUT url for your Android client to use while uploading the video file.

You should not create your Direct Uploads directly from your app. Instead, refer to the [Direct Upload Guide](/docs/guides/upload-files-directly) to create them securely on your server backend.

## Creating and starting your `MuxUpload`

To perform the upload from your Android app, you can use the `MuxUpload` class. At the simplest, you need to build your `MuxUpload` via its `Builder`, then add your listeners and `start()` the upload.

```kotlin

/**
 * @param myUploadUrl PUT URL fetched from a trusted environment
 * @param myVideoFile File where the local video is stored. The app must have permission to read this file
 */
fun beginUpload(myUploadUrl: Uri, myVideoFile: File) {
  val upl = MuxUpload.Builder(myUploadUrl, myVideoFile).build()
  upl.addProgressListener { innerUploads.postValue(uploadList) }
  upl.addResultListener {
    if (it.isSuccess) {
      notifyUploadSuccess()
    } else {
      notifyUploadFail()
    }
  }
  upl.start()
}
  
```

```java

/**
 * @param uploadUrl PUT URL fetched from a trusted environment
 * @param videoFile File where the local video is stored. The app must have permission to read this file
 */
public void beginUpload(Uri uploadUrl, File videoFile) {
  MuxUpload upload = new MuxUpload.Builder(uploadUrl, videoFile).build();
  upload.setProgressListener(progress -> {
    handleProgress(progress);
  });
  upload.setResultListener(result -> {
    if (UploadResult.isSuccessful(result)) {
      handleSuccess(UploadResult.getFinalProgress(result));
    } else {
      handleFailure(UploadResult.getError(result));
    }
  });
  upload.start();
}
  
```



### Resume uploads after network loss or process death

The upload SDK will keep track of uploads that are in progress. When your app starts, you can restart them using `MuxUploadManager`. For more information on managing, pausing, and resuming uploads, see the next section of this guide.

```kotlin

// You can do this anywhere, but it's really effective to do early in app startup
MuxUploadManager.resumeAllCachedJobs()
  
```

```java

// You can do this anywhere, but it's really effective to do early in app startup
MuxUploadManager.INSTANCE.resumeAllCachedJobs();
  
```



### Upload from a coroutine

If you're using Kotlin coroutines, you don't have to rely on the listener API to receive notifications when an upload succeeds or fails. If you prefer, you can use `awaitSuccess` in your coroutine.

```kotlin
suspend fun uploadFromCoroutine(videoFile: File): Result<UploadStatus> {
  val uploadUrl = withContext(Dispatchers.IO) {
    getUploadUrl()  // via call to your backend server, see the guide above
  }
  val upload = MuxUpload.Builder(uploadUrl, videoFile).build()
  // Set up your listener here too
  return upload.awaitSuccess()
}
```

## Resuming and managing Uploads

`MuxUpload`s are managed globally while they are in progress. Your upload can safely continue while your user does other things in your app. Optionally, you can listen for progress updates for these uploads in, e.g., a foreground `Service` with a system notification, or a progress view in another `Fragment`.

### Find Uploads already in progress

`MuxUpload`s are managed internally by the SDK, and you don't have to hold onto a reference to your `MuxUpload` in order for the upload to complete. You can get a `MuxUpload` object for any file currently uploading using `MuxUploadManager`.

This example listens for progress updates. You can also `pause()` or `cancel()` your uploads this way if desired.

```kotlin

fun listenToUploadInProgress(videoFile: File) {
  val upload = MuxUploadManager.findUploadByFile(videoFile)
  upload?.setProgressListener { handleProgress(it) }
}
```

```java

public void listenToUploadInProgress(File videoFile) {
  MuxUpload uploadInProgress = MuxUploadManager.INSTANCE.findUploadByFile(videoFile);
  if (uploadInProgress != null) {
    uploadInProgress.setProgressListener(progress -> handleProgress(progress));
  }
}
```



## Advanced

### Setting a Maximum resolution

If desired, you may choose a maximum resolution for the content being uploaded. You may wish to scale down video files that are too large for your asset tier, for instance. This can save data costs for your users, and it ensures that your assets are available to play as soon as possible.

The Mux Upload SDK scales down any input video larger than 4k (3840x2160 or 2160x3840) by default. You can choose to scale them down further to save on user data, or if you're targeting a basic video quality asset.

### Disable Input Standardization

<Callout type="warning">
  The setting described here will only affect *local* changes to your input. Mux Video will still convert any non-standard inputs to a standard format during ingestion.
</Callout>

The Upload SDK is capable of processing input videos in order to optimize them for use with Mux Video. This behavior can be disabled if it isn't desired, although this may result in extra processing on Mux's servers. We don't recommend disabling standardization unless you are experiencing issues.

```kotlin

fun beginUpload(myUploadUrl: Uri, myVideoFile: File) {
  val upl = MuxUpload.Builder(myUploadUrl, myVideoFile)
    .standarizationRequested(false) // disable input processing
    .build()
  // add listeners etc
  upl.start()
}
```

```java

public void beginUpload(Uri uploadUrl, File videoFile) {
  MuxUpload upload = new MuxUpload.Builder(uploadUrl, videoFile)
      .standarizationRequested(false) // disable input processing
      .build();
  // add listeners etc
  upload.start();
}
```



## Release notes

### Current release

### 1.0.4

Improvements:

* fix: crash on app resume when system clears cached upload files

### Previous releases

### 1.0.3

Updates:

* Update libyuv for 16KB page size

### 1.0.2

Improvements:

* fix: crashing while handling exceptions in some cases
* fix: input standardization reporting metrics when not opted in

### 1.0.1

* fix: possible bad behavior if the hosting app doesn't have a version set
* chore: update to kotlin 2.0.10 + others

### 1.0.0

New:

* 4k and 720p input standardization
* Background-Uploading example in sample app

Fixes:

* fix: currentStatus always reports READY
* fix: upload success reported as failure in some cases

### 0.5.0

New:

* Audio transcoding

Improvements:

* Improved performance reporting

### 0.4.2

New

* feat: Add `UploadResult` class for java users to interpret Result

Fixes

* Fix: Some methods on MuxUpload.Builder don't return Builder
* Fix: the application context shouldn't be visible
* Fix: Transcoding errors handled incorrectly

### 0.4.1

New

* feat: Add API for listening to and retrieving the status of an upload
* feat: Add Input Standardization, to process videos device-side

Fixes

* fix: Metrics events not properly redirected

Known Issues

* There's no notification/callback for the result of input standardization

### 0.4.0

Improvements

* doc: Finish out the public KDoc
* doc: Improve KDocs and add logo
* nfc: Hide some internal classes from java. This should not affect anyone using the SDK

Fixes

* fix: pausing uploads shouldn't create errors

### 0.3.1

Improvements

* fix: Restarts broken due to invalid progress data

### 0.3.0

Breaking

* breaking: Remove extraneous MIME type and Retry Time Builder fields. Configuring these did nothing

Updates

* update: Add the ability to resume all failed or paused uploads

Improvements

* Greatly improved the example app

### 0.2.0

Improvements

* fix: Hide some constructors to prevent unintended use by java callers
* feat: Add Performance Metrics Tracking

### 0.1.0

🥳 First beta release of the Mux Android SDK

Features

* Upload multiple files at once
* Pause and resume uploads, even after process death
* Observe upload progress from anywhere in your app without extra code


# Upload video directly from iOS or iPadOS
Allow your users to upload video to Mux from an iOS or iPadOS application with Direct Uploads and the Upload SDK.
[Direct Uploads](/docs/guides/upload-files-directly) allow you to upload content from your client applications directly to Mux using an authenticated URL, without any intermediary steps.

This guide will help you install the Upload SDK from Mux. The Upload SDK is designed to handle common tasks required to upload large video files, like file chunking and networking. By using the Upload SDK, your application will also become able to pause and resume uploads across restarts, report upload progress, and make adjustments that minimize processing time when your upload is ingested by Mux.

The Upload SDK is supported on iOS 14 and iPadOS 14, or higher. macOS is not supported at this time.

Your application can also handle uploads [on its own](/docs/guides/upload-files-directly#if-you-dont-want-to-use-upchunk) using built-in [`URLSession`](https://developer.apple.com/documentation/foundation/urlsession) and [file system](https://developer.apple.com/documentation/foundation/file_system) APIs. We encourage you to check out the Upload SDK [implementation](https://github.com/muxinc/swift-upload-sdk) as an example to follow along.

## Install the SDK

Let's start by installing the SDK. We'll use the Swift Package Manager. [Step-by-step guide on using Swift Package Manager in Xcode](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app).

Open your application's project in Xcode. In the Xcode menu bar, select File > Add Packages. In the top-right corner of the modal window that opens, enter the SDK repository URL: `https://github.com/muxinc/swift-upload-sdk`.

By default Xcode will fetch the latest version of the SDK available on the `main` branch. If you need a specific package version or to restrict the range of package versions used in your application, select a different Dependency Rule. [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).

Click Add Package to begin resolving and downloading the SDK package. When that completes, select your application target as the destination for the `MuxUploadSDK` package product. To use the SDK in your application, import its module: `import MuxUploadSDK`.

## Upload content from your application

## Getting an authenticated URL from Mux Video

<Callout type="info">
  You must create a new [Direct Upload](/docs/guides/upload-files-directly#1-create-an-authenticated-mux-url) to upload a new video to Mux.

  The Direct Upload will contain an authenticated `PUT` url that's unique to your upload. Your application will upload video to this url.
</Callout>

Direct Uploads are resumable: if your application started an upload and needed to pause it, use the same url to resume the upload.

We recommend that you avoid creating Direct Uploads outside of a trusted environment such as a backend server. Your application can request a new authenticated URL from your server when it needs one. You can also hardcode a pre-made URL in an internal build of your application for a one-time test.

## Create and start your direct upload

Once your application has an authenticated direct upload URL, you're ready to start uploading!

Your application will use the authenticated url to construct a `PUT` request. The body of the request will contain your video data. The `DirectUpload` API in the Upload SDK handles these operations for you.

Initialize a `DirectUpload` with your authenticated URL and a local video file URL. We'll also set the progress handler callback to log the upload progress to the console. In a later example you'll learn how to customize how an upload behaves.

```swift

  import MuxUploadSDK

  // The url found in the response after creating a direct upload
  let authenticatedURL: URL = /* fetch from trusted environment */

  // In this example we're uploading a video input file saved locally inside the application sandbox
  let videoInputURL: URL = /* URL to a video available locally */

  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL
  )

  // Let's log the progress to the console
  directUpload.progressHandler = { state in
    print("Uploaded \(state.progress.completedUnitCount) / \(state.progress.totalUnitCount)")
  }

  // Then start the direct upload
  directUpload.start()
  
```



## Tactics for handling large files

### Chunking uploads

Smaller videos can be uploaded in a single request. We recommend breaking larger videos into chunks and uploading each chunk in a separate request.

The Upload SDK handles the required networking and file-chunking operations for you regardless of file size. By default the SDK splits your video into *8MB* chunks when necessary. To change the chunk size, your application initializes its own `DirectUploadOptions` and passes the custom size as `chunkSizeInBytes`. Be sure to convert any quantities expressed in kilobytes or megabytes to bytes first.

Initialize a `DirectUpload` and pass the custom options you've created as the `options` parameter. A default set of options will be used if `options` isn't set when initializing a `DirectUpload`.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Construct custom upload options to upload a file in 6MB chunks
  let chunkSizeInBytes = 6 * 1024 * 1024
  let options = DirectUploadOptions(
    chunkSizeInBytes: chunkSizeInBytes
  )

  // Initialize a DirectUpload with custom options
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Let's log the upload progress to the console
  directUpload.progressHandler = { state in
    print("Uploaded \(state.progress.completedUnitCount) / \(state.progress.totalUnitCount)")
  }

  // Then start the direct upload
  directUpload.start()
  
```



Smaller chunk sizes result in more requests, while larger chunk sizes lead to fewer, longer-running requests. We recommend using a smaller chunk size on unstable or lossy networks.
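To make the tradeoff concrete, here's a quick sketch (Python for illustration; `chunk_count` is a hypothetical helper, not part of the SDK) of how chunk size translates into request count:

```python
import math

MEGABYTE = 1024 * 1024  # chunk sizes are expressed in bytes

def chunk_count(file_size_bytes, chunk_size_in_bytes=8 * MEGABYTE):
    """Number of PUT requests needed to upload a file of the given size."""
    return math.ceil(file_size_bytes / chunk_size_in_bytes)
```

For example, a 100MB file uploads in 13 requests at the default 8MB chunk size, but 17 requests at 6MB chunks.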

### What happens if an upload request fails?

When the SDK becomes aware of a failed upload `PUT` request, it will automatically retry it. By default the SDK will retry uploading each chunk up to 3 times before the upload is deemed to have failed. This limit can be altered by your application by initializing its own `DirectUploadOptions` with a custom value for `retryLimitPerChunk`. Then initialize a `DirectUpload` with the custom options as the `options` argument.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Construct custom upload options with a higher per-chunk retry limit
  let options = DirectUploadOptions(
    retryLimitPerChunk: 5
  )

  // Initialize a DirectUpload that will retry each chunk
  // request up to 5 times
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```
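The per-chunk retry policy described above amounts to a bounded retry loop. A language-agnostic sketch (Python; `put_chunk_with_retries` and `put_chunk` are hypothetical stand-ins for the SDK's internals):

```python
def put_chunk_with_retries(put_chunk, retry_limit_per_chunk=3):
    """Run `put_chunk` once, retrying up to `retry_limit_per_chunk` more
    times if it raises; re-raise once the limit is exhausted."""
    for attempt in range(retry_limit_per_chunk + 1):
        try:
            return put_chunk()
        except Exception:
            if attempt == retry_limit_per_chunk:
                raise
```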



### Pause and resume uploads

Your application might become suspended or terminated in the middle of a long-running upload. You can avoid losing the progress completed so far by pausing the upload and resuming it when the app becomes active again.

```swift

  import MuxUploadSDK

  class UploadCoordinator {
    func handleApplicationWillTerminate() {
      UploadManager.shared.allManagedUploads().forEach { upload in
        upload.pause()
      }
    }

    func handleApplicationDidBecomeActive() {
      UploadManager.shared.resumeAllUploads()
    }
  }
  
```



A direct upload can be resumed as long as it remains in a `waiting` status and hasn't yet transitioned to a `timed_out` status. You can customize this length of time by setting the `timeout` value in the <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create direct upload request</ApiRefLink> to a value between 1 minute and 7 days. If no value is set the upload times out 1 hour after being *created*.
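If your backend lets clients request their own timeout, it's worth clamping the value to the accepted range before creating the Direct Upload. A minimal sketch (Python; the function name is hypothetical):

```python
MIN_TIMEOUT_SECONDS = 60                # 1 minute
MAX_TIMEOUT_SECONDS = 7 * 24 * 60 * 60  # 7 days

def clamp_upload_timeout(requested_seconds):
    """Keep a requested direct-upload timeout within the accepted range."""
    return max(MIN_TIMEOUT_SECONDS, min(requested_seconds, MAX_TIMEOUT_SECONDS))
```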

## Need a playable asset as fast as possible?

<Callout type="info" title="Beta Functionality">
  The APIs around this feature are not final.
</Callout>

After your direct upload is completed, Mux Video will convert the uploaded input into a playable asset.

Some types of inputs require additional processing time during ingestion before becoming ready for playback. By default the Upload SDK reduces the processing time by adjusting upload inputs locally to a faster-to-process format when needed. More details on how audio and video input formats relate to new asset processing time are [available here](/docs/guides/minimize-processing-time).

### Setting a maximum resolution

The SDK can adjust the resolution of your video input locally before it is uploaded to Mux. By default the SDK will adjust the input resolution to 1920 x 1080 for any inputs that are larger.

You can also reduce the maximum resolution further to 1280 x 720. Initialize a new `DirectUploadOptions` and set `.preset1280x720` as `InputStandardization.maximumResolution`.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Reduce the maximum resolution to 1280 x 720
  let options = DirectUploadOptions(
    inputStandardization: .init(maximumResolution: .preset1280x720)
  )

  // Initialize a DirectUpload with custom options
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```



### Skipping input adjustments

<Callout type="warning">
  The setting described here will only affect *local* changes to your input. Mux Video will still convert any non-standard inputs to a standard format during ingestion.
</Callout>

In most cases your application won't need to bypass these adjustments. When necessary, they can be skipped by initializing `DirectUploadOptions` with `.skipped` for `inputStandardization`, then passing those options to the `options` argument when initializing a new `DirectUpload`, as in the earlier examples.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Skip adjustments to your input locally
  let options = DirectUploadOptions(
    inputStandardization: .skipped
  )

  // Initialize a DirectUpload that skips input standardization
  // and uploads your video as-is
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```



## Release notes

### Current release

### 1.0.0

Improvements

* Direct uploads are cancelable while inputs are standardized on the client
* Video inputs can be standardized to 2160p (4K) resolution
* Upload source `AVAsset` has the correct URL

Known Issues

* When checking whether a video input file is standard, the SDK compares an averaged bitrate to a resolution-dependent limit. If different parts of the video input have varying bitrates, the video may require further processing by Mux upon ingestion.

### 0.7.0

New

* Add macOS deployment target

Improvements

* Fix memory leak occurring when uploading large files

### 0.6.0

New

* Add Foundation Measurement API for chunk size

Breaking

* Rename Version to SemanticVersion for explicitness in API

Improvements

* Remove UIKit dependency from SDK
* Backfill missing inline API docs

### 0.5.0

New

* Add an overload initializer for `DirectUploadOptions`

Breaking

* Remove prefix use in public APIs and respell Upload as DirectUpload

### Previous releases

### 0.4.0

API Changes

* Deprecation: `MuxUpload.init(uploadURL:videoFileURL:chunkSize:retriesPerChunk:)` has been deprecated and will be removed in a future SDK version. Use `init(uploadURL:inputFileURL:options:)` instead
* Breaking Change: `MuxUpload.startTime` now returns an optional value
* Breaking Change: `MuxUpload.Status` has been renamed to `MuxUpload.TransportStatus`
* Add: `UploadOptions` struct to contain all available `MuxUpload` options
* Add: Options to request or to skip input standardization
* Add: `MuxUpload` initializer APIs that accept `AVAsset` or `PHAsset`
* Add: `MuxUpload.InputStatus` enum to represent the current state of the upload and change handler

New

* Support for on-device input standardization, to create a playable asset from your direct upload faster. When input standardization is requested from the SDK, input video is converted to a standard range of values on a best-effort basis

Fixes

* Prevent integer overflow when calculating chunk request content ranges
* Prevent crash from chunk worker force unwrap
* Remove public methods from internal SDK classes
* Prevent removal of result handler properties when passing MuxUpload via UploadManager

### 0.3.0

API Changes

* `MuxUpload`'s initializer no longer requires a MIME type or Retry Time. These are calculated internally
* Added methods for querying the `UploadManager` for the list of currently-active uploads, and listening for changes to the list
* Add opt-out for upload statistics

Improvements

* Add a much-improved example app

### 0.2.1

Improvements

* Track upload statistics

Fixes

* Resumed Uploads start at the beginning of the file

### 0.2.0

Improvements

* Remove Alamofire Dependency

### 0.1.0

Our first release of Mux's Swift Upload SDK!! 🎉 💯

This public beta release includes chunked, pausable, resumable video uploads for Mux Video. You can upload from anywhere in your app and query the upload state from anywhere in your app, regardless of your app architecture. Uploads can be resumed even after your app restarts following a shutdown.


# Minimize processing time
Learn how to optimize your video files for the fastest processing time.
Mux Video accepts most modern video formats and codecs. However, certain types of inputs need to be *standardized* in order for Mux to do further operations on them, and this can add time before the video is ready to be streamed. If you want to standardize your content before sending it to Mux, and potentially improve performance, this guide will show what you need to do.

## Standard input specs

To be considered standard input, the input video file must meet the following requirements:

* **H.264 or HEVC video codecs**. H.264 is the dominant video codec in use today and almost every device supports H.264. HEVC is a more modern codec that's increasingly popular, though not as universally supported. While Mux accepts other codecs as input, other codecs are considered non-standard and will be standardized automatically to H.264.
* **Closed GOP** (group-of-pictures). (Warning: video jargon ahead. You can likely ignore this) In video files encoded with closed GOPs, all B frames reference other frames in the same GOP. A closed GOP always begins with an IDR (Instantaneous Decoder Refresh) frame, which means every GOP can be played independently, without reference to another GOP. Standard inputs must be encoded with closed GOPs.
* **8-bit 4:2:0, or 10-bit 4:2:0 if HEVC**. This refers to the color depth and chroma subsampling. If you don't know what this is, you can probably ignore this, since most streaming video is 8-bit 4:2:0. HDR usually uses 10-bit 4:2:0, which is only supported as standard when using the HEVC video codec. However, SDR is preferred as Mux does not provide full HDR support.
* **Simple Edit Decision Lists**. Edit Decision Lists (EDLs) are typically added during post-production and define how certain segments are used to build a track timeline for playback. A good example of a simple EDL is one that fixes out-of-order frames in the video. Input files with more complex uses of EDLs are considered non-standard.
* **AAC audio codec**. AAC is the dominant audio codec in use today and almost every device supports this audio codec. Mono, stereo, and 5.1 AAC are all considered standard formats. While Mux accepts files that use other audio codecs, Mux only delivers AAC audio and non-AAC audio inputs are considered non-standard.

<Callout type="warning" title="5.1 AAC Standard Rollout">
  Support for 5.1 AAC as a standard format is being gradually rolled out. During the rollout period, some 5.1 AAC inputs may still be processed as a non-standard input.
</Callout>

### Additional requirements for assets with a `max_resolution_tier` of 1080p

Assets ingested up to 1080p are subject to the following standard input requirements.

* **1080p/2K or smaller**. Files with a resolution of up to 2048x2048 are considered standard. Files with a resolution larger than this are considered non-standard, unless ingested with a higher `max_resolution_tier`.
* **Max 20-second keyframe interval (10 seconds for HEVC)**. To stream well using HTTP-based streaming methods like HLS, Mux requires all keyframe intervals to be less than 20 seconds, or less than 10 seconds if encoded with HEVC.
* **8Mbps or below**. While Mux accepts higher bitrate inputs, average bitrates higher than 8Mbps are generally challenging for most viewers' connections and are considered non-standard. The bitrate should not exceed 16Mbps for any single GOP.
* **Frame rate between 5 and 120**. Inputs with an average frame rate below 5 fps or above 120 fps are considered non-standard and will be automatically standardized to 30 fps. Frame rates within this range are preserved (e.g. 60 fps remains 60 fps).

### Additional requirements for assets with a `max_resolution_tier` of 2160p (4K)

[Assets ingested at 2K and 4K resolutions](/docs/guides/stream-videos-in-4k) are subject to the following standard input requirements.

* **2160p or smaller**. The input file must not have any dimension (width, height, or both) that exceeds 4096 pixels.
* **Max 10-second keyframe interval (6 seconds for HEVC)**. To stream 4K video well, a 10-second keyframe interval is required. If using the HEVC video codec, this is further limited to 6 seconds.
* **20Mbps or below**. While Mux accepts higher bitrate inputs, bitrates higher than 20Mbps are generally challenging for most viewers' connections.
* **Frame rate between 5 and 60**. For 4K videos, a frame rate above 60 fps is considered non-standard.
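If you want to pre-screen files before uploading, a few of the limits above can be encoded as a simple predicate. This sketch (Python; the `meta` keys are hypothetical probe output, not a Mux API) checks codecs, resolution, average bitrate, and frame rate only, ignoring GOP structure, color format, and EDLs:

```python
def is_standard_input(meta, max_resolution_tier="1080p"):
    """Rough check of a subset of the standard-input limits above."""
    limits = {
        "1080p": {"dim": 2048, "mbps": 8, "fps": (5, 120)},
        "2160p": {"dim": 4096, "mbps": 20, "fps": (5, 60)},
    }[max_resolution_tier]
    fps_lo, fps_hi = limits["fps"]
    return (
        meta["video_codec"] in ("h264", "hevc")
        and meta["audio_codec"] == "aac"
        and max(meta["width"], meta["height"]) <= limits["dim"]
        and meta["avg_bitrate_mbps"] <= limits["mbps"]
        and fps_lo <= meta["fps"] <= fps_hi
    )
```

Note that a file failing this check still ingests fine; it just takes longer, as described under "Non-standard input" below.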

## How to create standard input (ffmpeg)

As a starting point, here is a sample ffmpeg command for creating video that complies with Mux standard input. Feel free to modify this by using things like 2-pass encoding, different presets, or different bitrates (as long as the total bitrate ends up below 8Mbps).

```shell
ffmpeg -i input.mp4 -c:a copy -vf "scale=w=min(iw\,1920):h=-2" -c:v libx264 \
-profile:v high -b:v 7000k -g 239 -pix_fmt yuv420p -maxrate 16000k -bufsize 24000k out.mp4
```

### Standard input for 4K

If you are creating a 4K video, the resolution and bitrate limits are higher. Here is a sample ffmpeg command for creating video that complies with Mux standard input for 4K.

```shell
ffmpeg -i input.mp4 -c:a copy -vf "scale=w=min(iw\,4096):h=-2" -c:v libx264 \
-profile:v high -b:v 18000k -g 239 -pix_fmt yuv420p -maxrate 36000k -bufsize 54000k out.mp4
```

## How to create standard input on mobile devices

<Callout type="info" title="Mux mobile upload SDKs">
  Mux's [iOS](https://www.mux.com/docs/guides/upload-video-directly-from-ios-or-ipados) and [Android](https://www.mux.com/docs/guides/upload-video-directly-from-android) upload SDKs are optimised to pre-process video files created on mobile devices to create standard input video files before uploading to Mux.
</Callout>

Many mobile devices capture H.264 8-bit 4:2:0 video by default. More recent mobile devices increasingly use HEVC 4:2:0 by default (either 8-bit with HDR disabled, or 10-bit with HDR enabled). Here are the main things to watch out for:

* Ensure that the total file bitrate is below 8 mbps.
* Prefer recording video in SDR (standard dynamic range). Some newer devices capture video in HDR (High Dynamic Range), which requires 10-bit 4:2:0 color. This is also acceptable, but is more likely to exceed the bitrate limit for standard input. If you have the choice, record SDR.
* Ensure the output file is no larger than 1080p (1920x1080) or 2K (2048x1152). Some cameras shoot 4K video, which by default is converted down to 1080p when using Mux Video. If you want to use 4K video, ensure you're enabling that in your API calls to Mux.
* If possible, choose a keyframe interval of 5s or so, but certainly between 1 and 10 seconds, and enable closed-GOP encoding. (If you don't see these options in your app or camera, it's probably the default already.)

## Non-standard input

Mux Video works fine with video outside of the standard input specs. But because such videos cannot be easily streamed to many modern devices, Mux Video must perform an initial encoding operation on non-standard input to create a mezzanine file. This means that non-standard input will be slower to ingest.

As soon as Mux Video detects that the input file is non-standard, it emits the `video.asset.non_standard_input_detected` event. This lets you know right away that your video will need additional processing time, along with the specific reasons why it's considered non-standard.

```json
{
  "type": "video.asset.non_standard_input_detected",
  "data": {
    "id": "{ASSET_ID}",
    "status": "preparing",
    "non_standard_input_reasons": {
      "video_gop_size": "high"
    }
    // ... other asset fields
  }
}
```

*Note that `non_standard_input_reasons` may not be final, as additional reasons may be found later and will be included on the asset after ingest completion.*

Mux Video also exposes this information in all subsequent asset webhooks, including the `video.asset.ready` event. This information is also returned in the asset object when retrieved using the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Get Asset API</ApiRefLink>.
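A webhook consumer can pull the reasons out of this event with a few lines. A sketch in Python, following the payload shape shown above:

```python
import json

def non_standard_reasons(webhook_body):
    """Extract non-standard input reasons from a Mux webhook payload,
    returning {} for any other event type."""
    event = json.loads(webhook_body)
    if event.get("type") != "video.asset.non_standard_input_detected":
        return {}
    return event["data"].get("non_standard_input_reasons", {})
```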

### Understanding the transcoding progress of a non-standard input

When an asset needs additional processing, you can use the `progress` field from the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Get Asset API</ApiRefLink> response, which returns the current `state` and the completion percentage of the transcode.

```json
// GET /video/v1/assets/{ASSET_ID}
{
  "id": "{ASSET_ID}",
  "status": "preparing",
  "non_standard_input_reasons": {
    "video_gop_size": "high"
  },
  "progress": {
    "state": "transcoding", 
    "progress": 23.02
  }
  // ... other asset fields
}
```

This field tells you both the current processing state and an estimated completion percentage from `0` to `100`, allowing you to keep your users informed with accurate progress indicators.

The progress field can have the following `state`s:

* `transcoding`: Asset is undergoing non-standard transcoding. `progress` will be between `0` and `100`.
* `ingesting`: Asset is being ingested (initial processing before or after transcoding). Progress will be `0`.
* `errored`: Asset has encountered an error (`status` is `errored`). Progress will be `-1`.
* `completed`: Asset processing is complete (`status` is `ready`).  Progress will be `100`.
* `live`: Asset is a live stream currently in progress.  Progress will be `-1`.
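These states map naturally onto user-facing progress text. A sketch (Python for illustration; the wording is up to you):

```python
def progress_label(progress):
    """Map the asset `progress` field to a user-facing status string."""
    state = progress["state"]
    if state == "transcoding":
        return "Processing: {:.0f}%".format(progress["progress"])
    return {
        "ingesting": "Processing",
        "completed": "Ready",
        "errored": "Something went wrong",
        "live": "Live now",
    }.get(state, "Unknown")
```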

## General limits

The max duration for any single asset is 12 hours.


# Control recording resolution
If the video being captured in your app doesn't need to be played back in full resolution, specify a lower resolution when recording to take advantage of Mux's resolution dependent pricing.
## Android

The way you control the resolution of a recorded video depends on the API used to record or encode it. All of Google's major camera and recording APIs have a method for setting either the exact or maximum resolution of the videos they create.

### CameraX

With the CameraX library, provide a `QualitySelector` that doesn't allow resolutions beyond 720p (1280x720).

```kotlin
// Selects only Standard HD (720p) and Standard Definition (480p)
val selector = QualitySelector.fromOrderedList(
  listOf(Quality.HD, Quality.SD),
  FallbackStrategy.lowerQualityOrHigherThan(Quality.SD)
 )

val recorder = Recorder.Builder()
  .setQualitySelector(selector)
  ...
  .build()
```

### MediaCodec

If you are encoding video yourself via the `MediaCodec` API, you can set the encoder's output resolution in the input `MediaFormat`. For more information on how to configure and use `MediaCodec`, [try the docs](https://developer.android.com/reference/android/media/MediaCodec).

```kotlin
val mediaCodec = MediaCodec.createByCodecName(codecName)
val encodeFormat = MediaFormat().apply {
  setInteger(MediaFormat.KEY_FRAME_RATE, myExampleFrameRate)
  //... Other required params
  // Output 720p
  setInteger(MediaFormat.KEY_HEIGHT, 720)
  setInteger(MediaFormat.KEY_WIDTH, 1280)
}
mediaCodec.configure(
  encodeFormat,
  myInputSurface,
  null,
  MediaCodec.CONFIGURE_FLAG_ENCODE
)
```

### Camera2

Camera2 doesn't have an API to set the video resolution directly; instead, it infers it from the input surface. You have to call `SurfaceHolder.setFixedSize()` on your capture requests' targets. This can only be done on Lollipop/API 21 or higher. Please refer to [the camera2 docs](https://developer.android.com/reference/android/hardware/camera2/CameraDevice#createCaptureSession\(android.hardware.camera2.params.SessionConfiguration\)) for more information.

```kotlin
val supportedCameraResolutions = streamConfigMap.getOutputSizes(ImageFormat.NV21)
val size =
  supportedCameraResolutions.toList().sortedBy { it.height }.findLast { it.height <= 720 && it.width <= 1280 }
size?.let { cameraSurfaceHolder.setFixedSize(it.width, it.height) }
cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
  .apply { addTarget(cameraSurfaceHolder.surface) }
  // ...
  .build()
cameraDevice.createCaptureSession(...)
```

### MediaRecorder

MediaRecorder's output size can be configured by calling `MediaRecorder.setVideoSize()` before calling `prepare()`.

```kotlin
mediaRecorder.setVideoSize(1280, 720)
mediaRecorder.prepare()
```

## iOS and iPadOS

This guide covers setting a maximum video resolution when recording video on iOS and iPadOS. The directions and code examples on this page assume you are using AVFoundation to set up and configure your camera. If you've never used AVFoundation before, we recommend you brush up on the basics before proceeding; see the official Apple [documentation](https://developer.apple.com/documentation/avfoundation) for a quick introduction and sample code.

Video recording on iOS is managed using [AVCaptureSession](https://developer.apple.com/documentation/avfoundation/avcapturesession). The resolution of video output from AVCaptureSession can be configured using a session preset. This example shows how to configure AVCaptureSession to output video at a resolution of 720p (1280 x 720 pixels).

```swift
let session = fetchYourCaptureSession()
session.beginConfiguration()

let updatedSessionPreset = AVCaptureSession.Preset.hd1280x720
if session.canSetSessionPreset(updatedSessionPreset) {
    session.sessionPreset = updatedSessionPreset
}

session.commitConfiguration()
```

Don’t forget to call `beginConfiguration()` before applying any configuration changes. When all the configuration changes have been applied, make sure your implementation calls `commitConfiguration()`.

It is best for any work that is done in-between calls to `beginConfiguration()` and `commitConfiguration()` to be synchronous. If you need to perform any asynchronous tasks, such as fetching the preferred resolution from your backend, make sure those are complete before you begin to configure `AVCaptureSession`.

## OBS

Streams initiated via [OBS](https://obsproject.com/) can be configured in Settings > Video > Output (Scaled) Resolution.

<Image src="/docs/images/output-obs.png" width="988" height="761" />


# Play your videos
In this guide you will learn how to play Mux videos in your application.
## 1. Get your playback ID

Each `asset` and each `live_stream` in Mux can have one or more **Playback IDs**.

This is an example of the `"playback_ids"` from the body of your `asset` or `live_stream` in Mux. In this example, the `PLAYBACK_ID` is `"uNbxnGLKJ00yfbijDO8COxTOyVKT01xpxW"` and the `policy` is `"public"`.

<Callout type="warning">
  **Playback IDs** can have a policy of `"public"` or `"signed"`. For the purposes of this guide we will be working with `"public"` playback IDs.

  If this is your first time using Mux, start out with `"public"` playback IDs and then read more about [securing video playback with signed URLs](/docs/guides/secure-video-playback) later.
</Callout>

```json
"playback_ids": [
  {
    "policy": "public",
    "id": "uNbxnGLKJ00yfbijDO8COxTOyVKT01xpxW"
  }
],
```
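If an asset carries several playback IDs, you'll usually want a public one for this guide's URLs. A minimal sketch (Python for illustration; not part of any Mux SDK):

```python
def public_playback_id(playback_ids):
    """Return the first playback ID with a public policy, or None."""
    for playback in playback_ids:
        if playback["policy"] == "public":
            return playback["id"]
    return None
```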

## 2. Create an HLS URL

HLS is a standard protocol for streaming video over the internet. Most of the video you watch on the internet, both live and on-demand, is delivered over HLS. Mux delivers your videos in this standard format.

Because HLS is an industry standard, you are free to use any HLS player of your choice when working with Mux Video.

HLS URLs end with the extension `.m3u8`. Use your `PLAYBACK_ID` to create an HLS URL like this:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```
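Building the URL is plain string formatting, whatever your language. For example (Python sketch):

```python
def hls_url(playback_id):
    """Build the HLS stream URL for a Mux playback ID."""
    return "https://stream.mux.com/{}.m3u8".format(playback_id)
```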

<Callout type="info">
  If you're curious to learn more about how HLS works, you might find that [howvideo.works](https://howvideo.works) makes for some good bedtime reading.
</Callout>

## Other formats

HLS (`.m3u8`) is used for streaming assets (video on demand) and live streams. For offline viewing and post-production editing, take a look at the [download your videos](/docs/guides/download-your-videos) guide, which covers `mp4` formats and master access.

## 3. Use the HLS URL in a player

Most browsers do not support HLS natively in the `video` element (Safari and Microsoft Edge are exceptions). Some JavaScript will be needed in order to support HLS playback in your web application.

The default player on iOS and tvOS (AVPlayer) supports HLS natively, so no extra effort is needed. In the Swift example below we're using the [VideoPlayer](https://developer.apple.com/documentation/avkit/videoplayer) struct that comes with SwiftUI and AVKit.

Similarly, the default player ExoPlayer on Android also supports HLS natively.

<Callout type="info" title="Next.js React example">
  If you're using [Next.js](https://nextjs.org/) or React for your application, the [with-mux-video example](https://github.com/vercel/next.js/tree/canary/examples/with-mux-video) is a good place to start.

  `npx create-next-app --example with-mux-video with-mux-video-app`
</Callout>

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

private let playbackURL: URL = {
    guard let url = URL(string: "https://stream.mux.com/{PLAYBACK_ID}.m3u8") else {
        preconditionFailure("Invalid playback URL")
    }
    return url
}()

struct ContentView: View {
    private let player = AVPlayer(url: playbackURL)

    var body: some View {
        // VideoPlayer comes from SwiftUI.
        VideoPlayer(player: player)
            .onAppear {
                player.play()
            }
    }
}

```



## 4. Find a player

The examples below are meant to be a starting point. You are free to use **any player that supports HLS** with Mux videos. Here are some popular players that we have seen:

### Mux Player

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```



See [the Mux Player guide](/docs/guides/mux-player-web) for more details and configuration options.

### Mux Video Element

If Mux Player does more than you're looking for, and you're interested in using something more like the native HTML5 `<video>` element for your web application, take a look at the `<mux-video>` element. The Mux Video Element is a drop-in replacement for the HTML5 `<video>` element, but it works with Mux and has Mux Data automatically configured.

* HTML: [Mux Video element](https://github.com/muxinc/elements/tree/main/packages/mux-video)
* React: [MuxVideo component](https://github.com/muxinc/elements/tree/main/packages/mux-video-react)

### Popular web players

* [HLS.js](https://github.com/video-dev/hls.js) is free and open source. This library does not have any UI components like buttons and controls. If you want to either use the HTML5 `<video>` element's default controls or build your own UI elements HLS.js will be a great choice.
* [Plyr.io](https://plyr.io/) is free and open source. Plyr has UI elements and controls that work with the underlying `<video>` element. Plyr does not support HLS by default, but it can be used *with* [HLS.js](https://github.com/video-dev/hls.js). If you like the feel and theming capabilities of Plyr and want to use it with Mux videos, follow the [example for using Plyr + HLS.js](https://codepen.io/pen?template=oyLKQb).
* [Video.js](https://videojs.com/) is a free and open source player. As of version 7 it supports HLS by default. The underlying HLS engine is [videojs/http-streaming](https://github.com/videojs/http-streaming).
* [JWPlayer](https://www.jwplayer.com/html5-video-player/) is a commercial player and supports HLS by default. The underlying HLS engine is HLS.js.
* [Brightcove Player](https://player.support.brightcove.com/getting-started/overview-brightcove-player.html) is a commercial player built on Video.js and HLS is supported by default.
* [Bitmovin Player](https://bitmovin.com/video-player/) is a commercial player and supports HLS by default.
* [THEOplayer](https://www.theoplayer.com/) is a commercial player and supports HLS by default. The player chrome is built on Video.js, but the HLS engine is custom.
* [Agnoplay](https://www.agnoplay.com/) is a fully agnostic, cloud-based player solution for web, iOS and Android with full support for HLS.

### Use Video.js with Mux

Video.js kit is a project built on [Video.js](https://videojs.com) with additional Mux specific functionality built in.
This includes support for:

* Enabling [timeline hover previews](/docs/guides/create-timeline-hover-previews)
* [Mux Data integration](/docs/guides/monitor-video-js)
* `playback_id` helper (we'll figure out the full playback URL for you)

For more details, head over to the [Use Video.js with Mux](/docs/guides/playback-videojs-with-mux) page.

## 5. Advanced playback features

### Playback with subtitles/closed captions

Subtitles/Closed Captions text tracks can be added to an asset either on asset creation or later when they are available. Mux supports [SubRip Text (SRT)](https://en.wikipedia.org/wiki/SubRip) and [Web Video Text Tracks](https://www.w3.org/TR/webvtt1/) format for ingesting Subtitles and Closed Captions text tracks. For more information on Subtitles/Closed Captions, see this [blog post](https://mux.com/blog/subtitles-captions-webvtt-hls-and-those-magic-flags) and [the guide for subtitles](/docs/guides/add-subtitles-to-your-videos).
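As a sketch, adding an SRT subtitle track to an existing asset takes a request body along these lines (the URL and language are placeholders; see the subtitles guide for the exact endpoint and fields):

```json
{
  "url": "https://example.com/subtitles/en.srt",
  "type": "text",
  "text_type": "subtitles",
  "language_code": "en",
  "name": "English",
  "closed_captions": false
}
```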

Mux includes Subtitles/Closed Captions text tracks in the HLS manifest (.m3u8) for playback. Video players surface the available Subtitles/Closed Captions tracks and languages as options the viewer can enable, disable, or switch between. The player can also default to the viewer's device preferences.

<Image src="/docs/images/hls-player-options-menu.png" width={768} height={454} caption="HLS.js video player options menu for Subtitles/Closed Captions text track" />

If you are adding text tracks to your Mux videos, make sure you test them out with your player.

In addition, Mux also supports downloading of Subtitles/Closed Captions text tracks as "sidecar" files when [downloading your videos](/docs/guides/download-your-videos).

```none
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

Replace `{PLAYBACK_ID}` with your asset's playback ID and `{TRACK_ID}` with the unique identifier value returned when this subtitle/closed caption text track was added to this asset.

### Add delivery redundancy with Redundant Streams

Mux Video streams are delivered using multiple CDNs. The best performing CDN is selected for the viewer initiating the playback. Video is then streamed by that CDN for that particular user. When the selected CDN has a transient or regional failure, the viewer's playback experience could be interrupted for the duration of the failure. If this happens your application should handle the playback failure and re-initiate the playback session. Mux Video's CDN selection logic would then select a different CDN for streaming.

The redundant streams modifier allows Mux to list each rendition for every CDN in the HLS manifest. The order is based on CDN performance with the best performing one listed first. If your video player supports redundant streams then the player will detect the failure mid-playback and switch to the next CDN on the list during a failure without interrupting the playback.

For more information on the Redundant Streams playback modifier and player support based on our tests, see [this blog post](https://mux.com/blog/survive-cdn-failures-with-redundant-streams/).

To use this feature in your application add `redundant_streams=true` to the HLS URL:

```none
https://stream.mux.com/{PLAYBACK_ID}.m3u8?redundant_streams=true
```

<Callout type="warning">
  # Using `redundant_streams` with signed URLs

  If you are using [signed playback URLs](/docs/guides/secure-video-playback) make sure you include the extra parameter in your signed token.
</Callout>
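For illustration, here's a small helper that builds the HLS URL with the modifier (the function name is ours, not part of any Mux SDK):

```javascript
// Hypothetical helper: build a Mux HLS URL, optionally opting into redundant streams.
function hlsUrl(playbackId, { redundantStreams = false } = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  if (redundantStreams) {
    url.searchParams.set('redundant_streams', 'true');
  }
  return url.toString();
}

// e.g. hlsUrl('abc123', { redundantStreams: true })
// → 'https://stream.mux.com/abc123.m3u8?redundant_streams=true'
```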

This table shows the support of various video players for redundant streams. This table will be updated as more players are tested or updated. If your player isn't listed here, please reach out.

| Player | Version | Manifest 4xx | Manifest 5xx | Media 4xx | Media 5xx |
| :-- | :-- | :-- | :-- | :-- | :-- |
| Video.js | >= 7.6.6 |✅ |✅ |✅ |✅ |
| HLS.js | >= 0.14.11 |✅ |✅ |✅ |✅ |
| JWPlayer | Production Release Channel |✅ |✅ |✅ |✅ |
| Safari iOS (AVPlayer) | >= iOS 13.6.1 |✅ |✅ |✅ |✅ |
| Safari macOS | Safari >= 13.1.2 macOS 10.15.X |✅ |✅ |✅ |✅ |
| ExoPlayer | >= r2.12.0 |✅ |✅ |✅ |✅ |

### Securing video playback

When using a policy of `"public"` for your playback IDs, your HLS playback URLs will work for as long as the playback ID exists. If you use a `"signed"` policy then you can have more control over playback access. This involves creating signing keys and using JSON web tokens to generate signatures on your server. See the guide for [secure video playback](/docs/guides/secure-video-playback).

## 6. Next steps

<GuideCard
  title="Get images from a video"
  description="Now that you have playback working, build rich experiences into your application by previewing your videos with thumbnails and gifs."
  links={[
    {title: "Read the guide", href: "/docs/guides/get-images-from-a-video"},
  ]}
/>

<GuideCard
  title="Track your video performance"
  description="Add the Mux Data SDK to your player and start collecting playback performance metrics."
  links={[
    {title: "Read the guide", href: "/docs/guides/track-your-video-performance"},
  ]}
/>


# Lazy-loading Mux Player
Improve your users' page load experience by lazy-loading the Mux Player.
## Installation

### React

After [installing `@mux/mux-player-react`](/docs/guides/player-integrate-in-your-webapp), import Mux Player React Lazy from `@mux/mux-player-react/lazy`:

Depending on your bundler your import might look a little different. If you're having trouble with the import try:

* `@mux/mux-player-react/lazy`
* `@mux/mux-player-react/dist/lazy.mjs`
* `@mux/mux-player-react/dist/lazy`

For example:

```jsx
import MuxPlayer from "@mux/mux-player-react/dist/lazy.mjs";

export default function App() {
  return (
    <>
      <p style={{ backgroundColor: "#eee", height: "100vh" }}>
        Scroll down to see Mux Player load lazily.
      </p>
      <MuxPlayer
        playbackId="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
        metadata={{
          video_id: "video-id-54321",
          video_title: "Test video title",
          viewer_user_id: "user-id-007",
        }}
        style={{ aspectRatio: 16 / 9 }}
      />
    </>
  );
}
```

<Callout type="info">
  Mux Player React Lazy will not be available if you are using the hosted option
  on jsdelivr.com.
</Callout>

## Preventing cumulative layout shift

Because the player is added to the DOM after the page loads, it will cause a [cumulative layout shift](https://web.dev/cls), pushing content down and causing a jarring jump for your users. To prevent this, make sure your player has an `aspectRatio` style property. `@mux/mux-player-react/lazy` will display a placeholder with this aspect ratio while the player loads.

```jsx
<MuxPlayer
  playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  // without this line, the player will cause a layout shift when it loads
  style={{ aspectRatio: 16/9 }}
/>
```

## Customizing the placeholder

While Mux Player React Lazy loads, it will display a placeholder with the same background color as the player (by default, black).

<Player playbackId="Wd01CoLZp2Adx00qefHtyGVPSP2h4wO33OZqR00vf7wCnQ" style={{ aspectRatio: "495 / 274", '--center-controls': 'none' }} />

If the `placeholder=` attribute is defined, the attribute's contents will display in the placeholder before load. You can generate placeholders that match your video poster with `@mux/blurup`. [See the placeholder guide to learn more](/docs/guides/player-customize-look-and-feel#provide-a-placeholder-while-the-poster-image-loads).

<Player playbackId="bXA3Oh7v22fRBU013damYqUxFK6HrmJcrI00Q00b2OSvmc" style={{ aspectRatio: "656 / 277", '--center-controls': 'none' }} />

## Defining when to load

In addition to the standard attributes that Mux Player React accepts, Mux Player React Lazy will also accept a `loading` attribute:

* `loading="page"`: Loads the player and replaces a placeholder after the page loads and the initial JavaScript bundle is executed
* `loading="viewport"`: (Default) Extends `loading="page"` by also waiting until the placeholder has entered the viewport

## Using other frameworks

If you are working in an environment that supports [dynamic imports](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import), like [Webpack](https://webpack.js.org/guides/code-splitting/), [Rollup](https://rollupjs.org/guide/en/#code-splitting), [Parcel](https://parceljs.org/features/code-splitting/), or [many modern browsers](https://caniuse.com/es6-module-dynamic-import), you can reproduce the behavior of Mux Player React Lazy.

If you have access to a Node.js server, generate a placeholder that matches your video with `@mux/blurup`.

```js
// Server-Side
import { createBlurUp } from '@mux/blurup';

const options = {};
const muxPlaybackId = 'O6LdRc0112FEJXH00bGsN9Q31yu5EIVHTgjTKRkKtEq1k';

const getPlaceholder = async () => {
  const { blurDataURL, aspectRatio } = await createBlurUp(muxPlaybackId, options);
  console.log(blurDataURL, aspectRatio);
  // data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" width="100%" ...
};
```

Then, use a dynamic import to load Mux Player. When the load is complete, replace the placeholder with the player.

```html

<div class="wrapper">
  <div class="placeholder"></div>
</div>

<script>
const wrapper = document.querySelector(".wrapper");
const placeholder = document.querySelector(".placeholder");

import("@mux/mux-player").then(() => {
  const player = document.createElement("mux-player");

  player.setAttribute("playback-id", playbackId);
  player.setAttribute("placeholder", blurUpPlaceholder);
  player.setAttribute("metadata-video-title", "Test video title");
  player.setAttribute("metadata-viewer-user-id", "user-id-007");

  wrapper.replaceChild(player, placeholder);
});
</script>

<style>
.wrapper {
  aspect-ratio: {sourceWidth} / {sourceHeight};
  width: 100%;
  position: relative;
}
mux-player, .placeholder {
  position: absolute;
  inset: 0;
}
.placeholder {
  background-image: url({blurUpPlaceholder});
  background-color: black;
  background-size: contain;
  background-repeat: no-repeat;
}
</style>

```

```svelte

<script>
  const player = import('@mux/mux-player');
</script>

<main>
  <div class="wrapper" style:aspect-ratio="{sourceWidth / sourceHeight}">
    {#await player}
      <div class="placeholder" style:background-image="url('{data.blurUpPlaceholder}')" />
    {:then}
      <mux-player
        playback-id={playbackId}
        placeholder={blurUpPlaceholder}
        metadata-video-title="Test VOD"
        metadata-viewer-user-id="user-id-007"
      />
    {/await}
  </div>
</main>

<style>
  .wrapper {
    width: 100%;
    position: relative;
  }
  mux-player, .placeholder {
    position: absolute;
    inset: 0;
  }
  .placeholder {
    background-color: black;
    background-size: contain;
    background-repeat: no-repeat;
  }
</style>


```



# Running ads with Mux Player
Monetize your videos by running ads in Mux Player with the Google IMA SDK.
Mux Player doesn’t have a built-in way to integrate ads, but this guide shows how it can be achieved using client-side ad insertion with the Google IMA SDK. We'll demonstrate [preroll](https://www.mux.com/video-glossary/preroll) ads, though [midroll](https://www.mux.com/video-glossary/midroll) and [postroll](https://www.mux.com/video-glossary/postroll) ads can also be achieved using this approach.

If you're unfamiliar with the Google IMA SDK, we recommend reading the [documentation](https://developers.google.com/interactive-media-ads/docs/sdks/html5/client-side) as well as the examples in [this repository](https://github.com/googleads/googleads-ima-html5).

Within web video, ad insertion typically comes in two flavors:

* **SSAI - Server Side Ad Insertion**: A mechanism to insert advertisements into the linear video stream so that it’s played out *without* any other needed technology on the viewing side.
* **CSAI - Client Side Ad Insertion**: A method whereby the video player, located inside an application or website, requests an ad from an ad server. When the ad server receives the request, it sends back an ad that the player displays over the video content.

<Callout type="info">
The following guide uses vanilla JS; however, it can be applied to any popular framework (React, Angular, etc.)
</Callout>

## 1. Set up Mux Player

First, make sure you have Mux Player set up on your webpage. Include the following CDN links:

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
```

Then add Mux Player within a containing div element. This div element will act as a container for both Mux Player and the ad layer.

```html
<div id="mainContainer">
    <mux-player
    playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    metadata-video-title="Test VOD"
    metadata-viewer-user-id="user-id-007"
    ></mux-player>
</div>
```

<Callout type="info">
  [See the related player documentation](/docs/guides/mux-player-web)
</Callout>

## 2. Include the Google IMA SDK

```html
<script src="//imasdk.googleapis.com/js/sdkloader/ima3.js"></script>
```

<Callout type="warning">
  While developing locally, the newer versions of Google Chrome might block the loading of this script because of the lack of HTTPS/SSL support
</Callout>

## 3. Create an ad container

We need to add two container elements: one to contain the ad itself, which will overlay Mux Player, and another to wrap around both of them.

```html
<div id="mainContainer">
    <mux-player
    playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    metadata-video-title="Test VOD"
    metadata-viewer-user-id="user-id-007"
    ></mux-player>
    <div id="ad-container"></div>
</div>
```

Now we add some CSS to position the ad on top of the player. It's important that the `ad-container` is the exact same size as the player element, so that ads are displayed at the same size as the video.

```css
mux-player {
    width: 640px;
    height: 360px;
}

#mainContainer {
    position: relative;
    width: 640px;
    height: 360px;
}

#ad-container {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    z-index: 1000;
}
```

## 4. Initialize the IMA SDK and handle playback

Initialize the IMA SDK and configure it to use the ad container element we created earlier. The code below creates an `AdDisplayContainer` object, which is used to overlay ads on top of Mux Player, and an `AdsLoader` object, which loads ads via the IMA SDK. Then create an `AdsRequest` object and request ads; you'll need an ad tag URL from your ad server.

```javascript
let muxPlayer = document.querySelector('mux-player'); // initialize for later use

const adDisplayContainer = new google.ima.AdDisplayContainer(document.getElementById('ad-container'));
const adsLoader = new google.ima.AdsLoader(adDisplayContainer);
let adsManager; // declared here so it can be used in the event handlers below

adsLoader.addEventListener(google.ima.AdsManagerLoadedEvent.Type.ADS_MANAGER_LOADED, onAdsManagerLoaded, false);
adsLoader.addEventListener(google.ima.AdErrorEvent.Type.AD_ERROR, onAdError, false);

let adsRequest = new google.ima.AdsRequest();
adsRequest.adTagUrl = 'https://pubads.g.doubleclick.net/gampad/ads?iu=/21775744923/external/single_ad_samples&sz=640x480&cust_params=sample_ct%3Dlinear&ciu_szs=300x250%2C728x90&gdfp_req=1&output=vast&unviewed_position_start=1&env=vp&impl=s&correlator=';

// The above is a testing preroll ad. Please fill in your tag URL from your ad server.

adsRequest.linearAdSlotWidth = 640;
adsRequest.linearAdSlotHeight = 360;
adsRequest.nonLinearAdSlotWidth = 640;
adsRequest.nonLinearAdSlotHeight = 150;

adsLoader.requestAds(adsRequest);
```

When ads are loaded, initialize the AdsManager to start playing them. This sets up the Google IMA SDK's AdsManager and attaches event listeners to handle ad events. It also initializes and starts the AdsManager, ensuring that ads are played correctly. If an error occurs during initialization or ad playback, the content will be played instead.

```javascript
function onAdsManagerLoaded(adsManagerLoadedEvent) {
    let adsRenderingSettings = new google.ima.AdsRenderingSettings();
    adsManager = adsManagerLoadedEvent.getAdsManager(muxPlayer, adsRenderingSettings);

    // Add event listeners to the ads manager here
    adsManager.addEventListener(google.ima.AdErrorEvent.Type.AD_ERROR, onAdError);
    adsManager.addEventListener(google.ima.AdEvent.Type.CONTENT_PAUSE_REQUESTED, onContentPauseRequested);
    adsManager.addEventListener(google.ima.AdEvent.Type.CONTENT_RESUME_REQUESTED, onContentResumeRequested);

    try {
        adsManager.init(640, 360, google.ima.ViewMode.NORMAL);
        adsManager.start();
    } catch (adError) {
        muxPlayer.play(); // If ad fails, continue with the content
    }
}

function onAdError(adErrorEvent) {
    console.log(adErrorEvent.getError());
    if (adsManager) {
        adsManager.destroy();
    }
    muxPlayer.play(); // Continue with the content
}

function onContentPauseRequested() {
    muxPlayer.pause();
}

function onContentResumeRequested() {
    muxPlayer.play();
}
```

<Callout type="info">
In `adsManager.init(640, 360, google.ima.ViewMode.NORMAL)`, the values 640 and 360 should match the dimensions of the ad and main containers. Otherwise you will see unwanted results with the ad dimensions.
</Callout>

## 5. Link Mux Player events with IMA SDK

These event listeners synchronize ad playback with video playback, ensuring that everything is tied together.

```javascript
muxPlayer.addEventListener('play', function () {
    adDisplayContainer.initialize();
});

muxPlayer.addEventListener('pause', function () {
    if (adsManager) {
        adsManager.pause();
    }
});

muxPlayer.addEventListener('playing', function () {
    if (adsManager) {
        adsManager.resume();
    }
});
```

## 6. Start the ad and content

With the listeners above in place, the viewer's first `play` initializes the ad display container, the preroll plays, and your content resumes when the ad completes or errors.

<Callout type="warning">
Please be mindful when testing: if you're using an ad blocker, you will not receive any ads.
</Callout>


# Mux Player for iOS
Learn how to use Mux Player SDK to play video delivered by Mux in your iOS or iPadOS application
This guide will help you install the Mux Player SDK in your native iOS or iPadOS application. If you encounter any issues please let us know by filing [an issue on Github](https://github.com/muxinc/mux-player-swift).

## Install the SDK

Let's start by installing the SDK. We'll use the Swift Package Manager. [Step-by-step guide on using Swift Package Manager in Xcode](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app).

Open your applications project in Xcode. In the Xcode menu bar select File > Add Packages. In the top-right corner of the modal window that opens enter the SDK repository URL which is `https://github.com/muxinc/mux-player-swift`.

By default Xcode will fetch the latest version of the SDK available on the `main` branch. If you need a specific package version or to restrict the range of package versions used in your application, select a different Dependency Rule. [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).

Click on Add Package to begin resolving and downloading the SDK package. When completed, select your application target as the destination for the `MuxPlayerSwift` package product. To use the SDK in your application, import its module.

```swift
import MuxPlayerSwift
```

## Stream video from a Mux asset

Use `MuxPlayerSwift` to set up an `AVPlayerViewController` or `AVPlayerLayer` that can download and stream a Mux asset with only a playback ID. The SDK will also enable Mux Data monitoring to help you measure the performance and quality of your application's video experiences.

```swift
/// After you're done testing, you can check this video out to learn more about video and players (as well as some philosophy)
let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset, configured with the playback ID.
let playerViewController = AVPlayerViewController(playbackID: playbackID)

/// Prepare an AVPlayerLayer to stream and monitor a Mux asset, configured with the playback ID.
let playerLayer = AVPlayerLayer(playbackID: playbackID)
```

Your application can customize how Mux Video delivers video to the player using playback URL modifiers. A playback URL modifier is appended as a query parameter to a public playback URL. `MuxPlayerSwift` exposes a type-safe Swift API that constructs these URLs.

```swift
/// After you're done testing, you can check this video out to learn more about video and players (as well as some philosophy)
let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

/// Create playback options to limit resolution up to 720p.
let playbackOptions = PlaybackOptions(maximumResolutionTier: .upTo720p)

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset with playback options.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

The example above delegates constructing the playback URL and appending playback modifiers to the SDK.

When using the [`AVPlayerViewController`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avkit/avplayerviewcontroller/#initializers) or [`AVPlayerLayer`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avfoundation/avplayerlayer#initializers) convenience initializers provided by `MuxPlayerSwift`, there are no required steps to enable Mux Data monitoring for video streamed from a Mux playback URL.

[See the below section](/docs/guides/mux-player-ios#monitor-media-playback) for more details and how to customize Mux Data monitoring.

### AVPlayerLayer-backed views

If you're using a `UIView` that is backed by an `AVPlayerLayer` to display video, `MuxPlayerSwift` [exposes APIs](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avfoundation/avplayerlayer#instance-methods) to set up an existing `AVPlayerLayer` for playback.

```swift
/// Prepare an already initialized AVPlayerLayer to stream and monitor a Mux asset
func preparePlayerLayer(playbackID: String, in playerView: UIView) {
  // Check to make sure the player view backing layer is of the correct type and get a reference if it is.
  guard let playerLayer = playerView.layer as? AVPlayerLayer else {
    return
  }

  // Prepares the player layer to stream media and monitor playback with Mux Data.
  playerLayer.prepare(playbackID: playbackID)
}
```

Your application can also customize playback or monitoring with Mux Data by using the same parameters as shown for the `AVPlayerLayer` initializers above.

## Monitor media playback

By default, Mux Data metrics will be populated in the same environment as your playback ID. [Learn more about Mux Data metric definitions here](/docs/guides/understand-metric-definitions).

Read on for additional (and optional) setup steps to modify or extend the information Mux Data tracks.

Use [`MonitoringOptions`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/monitoringoptions) to set custom monitoring-related parameters.

If you're already using the Mux Data SDK for AVPlayer, [this initializer](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/monitoringoptions/init\(customerdata:playername:\)) allows you to reuse any of your existing logic for constructing `MUXSDKCustomerData`.

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let customEnvironmentKey = "ENV_KEY"

// Configure custom Mux Data player metadata.
let playerData = MUXSDKCustomerPlayerData()
playerData.environmentKey = customEnvironmentKey

// Configure custom Mux Data video metadata.
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Video Behind the Scenes"
videoData.videoSeries = "Video101"

// Combine metadata into customer data for monitoring.
let customerData = MUXSDKCustomerData()
customerData.customerPlayerData = playerData
customerData.customerVideoData = videoData

// Build monitoring options and create the player.
let monitoringOptions = MonitoringOptions(customerData: customerData, playerName: "MyPlayer1")
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  monitoringOptions: monitoringOptions
)
```

## Secure your playback experience

Mux Video offers several levels of playback access control. [See here for more](/docs/guides/secure-video-playback).

### Signed Playback URLs

`MuxPlayerSwift` supports playback of assets enabled for access with signed playback URLs. Playing back assets with a signed playback policy requires the player to include a valid and unexpired JSON Web Token (JWT) when requesting media from Mux.

Your application should generate and sign the JWT in a trusted environment that you control like a server-side application. As a security measure any playback modifiers must be passed through the JWT as additional claims.

Once your application is in possession of the JWT, it can begin streaming.

To start playback, use the JWT to initialize `PlaybackOptions`. Then, initialize `AVPlayerViewController` or `AVPlayerLayer` with your playback ID, as in prior examples.

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
let playbackOptions = PlaybackOptions(playbackToken: playbackToken)
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"

/// Prepare an AVPlayerLayer to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
let playbackOptions = PlaybackOptions(playbackToken: playbackToken)
let playerLayer = AVPlayerLayer(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

```swift
/// Prepare an already initialized AVPlayerLayer to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
func preparePlayerLayer(playbackID: String, playbackToken: String, in playerView: UIView) {
  let playbackOptions = PlaybackOptions(playbackToken: playbackToken)

  // Check to make sure the player view backing layer is of the correct type and get a reference if it is.
  guard let playerLayer = playerView.layer as? AVPlayerLayer else {
    return
  }

  // Prepares the player layer to stream media and monitor playback with Mux Data.
  playerLayer.prepare(playbackID: playbackID, playbackOptions: playbackOptions)
}
```

<Callout type="info">
If your JWT includes a playback restriction, Mux will not be able to perform domain validation when the playback URL is loaded by `AVPlayer` because no referrer information is supplied.

To allow `AVPlayer` playback of referrer-restricted assets, set the `allow_no_referrer` boolean parameter to `true` when creating a playback restriction. Conversely, a playback restriction with `allow_no_referrer` set to `false` will disallow `AVPlayer` playback. [See here for more](/docs/guides/secure-video-playback#using-referer-http-header-for-validation).
</Callout>

### Digital Rights Management

DRM (Digital Rights Management) provides an extra layer of content security for video content streamed from Mux.
For more details see the [Protect Your Videos with DRM guide](/docs/guides/protect-videos-with-drm).

<Callout type="warning">
  Mux uses the industry standard FairPlay protocol for delivering DRM'd video content to native iOS and iPadOS applications. To play back DRM'd content on these platforms, you'll need to obtain a FairPlay deployment package (also called a “Fairplay certificate”), [see more details from Apple here](/docs/guides/protect-videos-with-drm#prerequisites-for-fairplay-drm-on-apple-devices). *Without this, DRM’d content will not be playable in your application on these platforms.*
</Callout>

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"
let drmToken = "YOUR_DRM_TOKEN"

/// Configure playback options for a playback ID configured for DRM.
let playbackOptions = PlaybackOptions(
  playbackToken: playbackToken,
  drmToken: drmToken
)

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

## Customize playback

### Restrict Resolution Range

Mux Video gives you extra control over the available resolutions of your video.

`MuxPlayerSwift` exposes convenience APIs to adjust the [maximum](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/maxresolutiontier) and [minimum](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/minresolutiontier) resolutions if they are available.

Setting a maximum resolution helps reduce delivery costs while setting a minimum resolution helps ensure visual quality of your video. Maximum and minimum resolutions can be set independently or composed together.

<Callout type="info">
  If you're using signed URLs, you'll need to embed `min_resolution` and `max_resolution` into the JWT claims. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>
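For example, the JWT payload might carry the resolution claims alongside the standard ones (values illustrative):

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "v",
  "exp": 1735689600,
  "max_resolution": "1080p",
  "min_resolution": "720p"
}
```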

This example restricts the resolution range `AVPlayer` requests to be between 720p and 1080p.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to request renditions between 720p and 1080p.
let playbackOptions = PlaybackOptions(
  maximumResolutionTier: .upTo1080p,
  minimumResolutionTier: .atLeast720p
)

// Prepare an AVPlayerViewController to stream a Mux asset with this range.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

This example restricts the resolution `AVPlayer` requests to a single fixed resolution of 720p.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to request only the 720p rendition.
let playbackOptions = PlaybackOptions(singleRenditionResolutionTier: .only720p)

// Prepare an AVPlayerViewController to stream a Mux asset at a fixed 720p resolution.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

### Adjust Resolution Selection

When `AVPlayer` requests delivery of HLS content, it first downloads a series of text files, commonly called manifests or playlists, that describe the available qualities of a video and the location of all the segments that comprise it.

The order of renditions in the initial manifest or playlist can influence the resolution level a player selects. When using the Mux Player Swift SDK your application can manipulate that order in the [`PlaybackOptions`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/playbackoptions) provided to `AVPlayerViewController` or `AVPlayerLayer`.

If your Mux asset is publicly playable, specify a [`RenditionOrder`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/renditionorder).

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to prioritize higher renditions first.
let playbackOptions = PlaybackOptions(renditionOrder: .descending)

// Prepare an AVPlayerViewController to stream a Mux asset with descending rendition order.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

The available values for rendition order are [listed here](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/renditionorder#enumeration-cases).

<Callout type="info">
  If using signed playback URLs in your application, you'll need to include `"rendition_order": "desc"` in your JWT claims to use descending rendition order. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>

See [the Mux blog](https://www.mux.com/blog/more-tools-to-control-playback-behavior-min-resolution-and-rendition-order) and [this guide](/docs/guides/control-playback-resolution#using-playback-modifiers-to-manipulate-playback-resolution) for more on these options.

### Instant Clips

Instant clips are an alternative to our [long-standing asset-based clipping feature](/docs/guides/create-clips-from-your-videos). Requesting instant clips using relative time is now [available for use with all video on-demand (VOD) assets](https://www.mux.com/blog/instant-clipping-update).

Instant clipping allows you to request a stream whose start time is at some later point in the video, relative to the start time of the asset. Likewise, you can request a stream that ends sooner than the underlying asset does. Instant clips do not incur the wait time or expense of creating a new asset.

Unlike asset-based clipping, instant clipping works by trimming your VOD asset's HLS manifest. This means that instant clipping operates at the segment level of accuracy. You should expect that the content you clip may be several seconds longer than you've requested. We always make sure to include the timestamps that you request, but your content may start a few seconds earlier and end a few seconds later.

Assets that originate from a livestream can also be converted into instant clips using program date time epochs. Support for these clips will be available in a future Mux Player Swift release.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure instant clipping for a highlight clip of your publicly viewable asset.
// The clip starts about 10 seconds after your asset starts and finishes approximately 10 more seconds after that.
// A few extra seconds of video may be included in the clip.
let clipping = InstantClipping(assetStartTimeInSeconds: 10, assetEndTimeInSeconds: 20)
let playbackOptions = PlaybackOptions(clipping: clipping)

// Create an AVPlayerViewController to stream the instant clip.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

<Callout type="info">
  If using signed playback URLs in your application, you'll need to include `asset_start_time` and `asset_end_time` keys in your JWT claims to enable instant clipping. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>

## Improve playback performance

`AVPlayer` supports playing back HTTP live streams without pre-loading. However, when the same video is played more than once, `AVPlayer` provides no way to ensure cached video files are reused for subsequent playback. This may result in higher network throughput and additional video delivery cost when the same video is played repeatedly.

One alternative is to [enable static renditions for your Mux asset](/docs/guides/enable-static-mp4-renditions) that your application can download and persist using `URLSession` and `FileManager` APIs.

Another is to preload streams first. As with static renditions, this requires your application to manage downloads and handle storage. (See the [Apple offline playback and storage documentation](https://developer.apple.com/documentation/avfoundation/offline_playback_and_storage?language=objc) for more details on pre-loading.)

Mux Player Swift offers a caching mechanism for streams that provide only a single rendition to the player. The cache is primarily tested against, and provides the most cost-saving benefit for, playback constrained to a single rendition. When using the smart cache, your application must select the resolution tier to which playback will be constrained.

This avoids the manual intervention required by the previous two options.

Like a browser's cache, cached segments can be used by all players in your application, as long as they're configured to use the smart cache. If your application uses multiple player instances, you can enable the smart cache for them by using the convenience initializers provided by the SDK.

The example below enables the smart cache with playback at a single fixed resolution.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Select a single rendition tier for smart caching (for example, only 720p).
// Available values: https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/singlerenditionresolutiontier
let singleRenditionResolutionTier: SingleRenditionResolutionTier = .only720p

// Enable smart cache and constrain playback to the selected single rendition tier.
let playbackOptions = PlaybackOptions(
  enableSmartCache: true,
  singleRenditionResolutionTier: singleRenditionResolutionTier
)

// Prepare an AVPlayerViewController to stream a Mux asset with smart caching enabled.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

<Callout type="info">
  The smart cache will be automatically purged after your application is terminated. It may be purged by the operating system at any time when your application is suspended in the background.
</Callout>

Mux supports HLS video delivery with Transport Stream segments or Common Media Application Format (CMAF) chunks. Both are supported by the smart cache. [See this explainer for a general introduction to video delivery](https://howvideo.works/#delivery).


# Control playback resolution
Control the video resolution your users receive to give the best user experience and to take advantage of Mux's resolution-based pricing.
# Default playback URL

The default playback URL will contain all available resolutions of your video. The resolutions available will depend on the video source file.

By default, if the source file is 1080p or higher, then the highest resolution provided by Mux will be 1080p. If the source file is lower than 1080p, the highest resolution available will be the resolution of the source.

You can also stream 4K content using Mux Video, which will be delivered at higher resolutions including 2.5K and 4K. For more details see the [guide to streaming 4K videos](/docs/guides/stream-videos-in-4k).

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

Use the default playback URL for most use cases. The video player will determine the best resolution based on the available bandwidth of the viewer.

# Using playback modifiers to manipulate playback resolution

Mux exposes a set of [playback modifiers](/docs/guides/modify-playback-behavior), which give you extra control over the available resolutions of your content.

## Specify maximum resolution

The playback URL below with the `max_resolution` query parameter modifies the resolutions available for the player to choose from.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?max_resolution=720p
```

The `max_resolution` parameter can be set to `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, or `2160p`. You may want to do this to reduce your delivery costs, or to build a feature into your product where only certain viewers get lower-resolution video.

*Please note that not all resolutions are available for all assets. If you specify a max resolution that is not available for the asset, Mux will limit the resolution to the highest resolution available below the one you specified. For example, if you specify `max_resolution=1080p` but the highest resolution available for the asset is 720p, then the manifest will be capped at 720p.*

## Specify minimum resolution

The playback URL below with the `min_resolution` query parameter modifies the resolutions available for the player to choose from.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?min_resolution=720p
```

The `min_resolution` parameter can be set to `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, or `2160p`. You may want to use this to omit the lowest quality renditions from the HLS manifest when the visual quality of your content is critical to the delivery, for example in live streams where detailed screen share content is present.

*Please note that not all resolutions are available for all assets. If you specify a min resolution that is not available for the asset, Mux will use the lowest resolution available above the one you specified. For example, if you specify `min_resolution=270p` but the lowest resolution available for the asset is 360p, then the manifest will start at 360p.*

## Specify rendition order

By default, the first resolution in the playlist is one of the middle resolutions. Many players will start with the first one listed, so this default behavior strikes a balance by giving the player something that's not too low in terms of quality but also not too high in terms of bandwidth.

You may want to change this behavior by specifying `rendition_order=desc`, which sorts the list of renditions from highest (highest quality, most bandwidth) to lowest (lowest quality, least bandwidth). Players that start with the first rendition in the list will now attempt to start playback at the highest resolution. The tradeoff is that users on slow connections will experience increased startup time.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?rendition_order=desc
```
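If you construct playback URLs programmatically, the modifiers above can be composed with the standard URL API. Here's a minimal JavaScript sketch; the playback ID and modifier values are placeholders:

```javascript
// Build a Mux playback URL with optional playback modifiers.
// The playback ID and modifier values here are placeholders.
function playbackUrl(playbackId, modifiers = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  for (const [name, value] of Object.entries(modifiers)) {
    url.searchParams.set(name, value);
  }
  return url.toString();
}

playbackUrl('YOUR_PLAYBACK_ID', {
  min_resolution: '720p',
  max_resolution: '1080p',
  rendition_order: 'desc',
});
// → https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8?min_resolution=720p&max_resolution=1080p&rendition_order=desc
```

Remember that if you're using signed playback IDs, these parameters must be embedded in the JWT rather than appended to the query string, as described below.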

# Usage with signed URLs

If you are using `signed` Playback IDs according to the [Secure video playback guide](/docs/guides/secure-video-playback), then your playback modifiers must be encoded in the `token` that you generate on your server. [See the modify playback behavior guide](/docs/guides/modify-playback-behavior) for details on embedding extra params in your JWT.
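As a sketch of what that token carries, the payload below shows the shape of JWT claims with playback modifiers included. The IDs are placeholders, and the actual signing step (RS256 with your Mux signing key, for example via the `jsonwebtoken` package) is left out; see the guides linked above for the authoritative claim reference.

```javascript
// Sketch: assemble JWT claims for a signed playback URL that also
// carries playback modifiers. Sign this payload with RS256 using your
// Mux signing key, then append the result as ?token=... on the URL.
function buildPlaybackClaims(playbackId, signingKeyId, modifiers = {}) {
  return {
    sub: playbackId,                            // the signed playback ID
    aud: 'v',                                   // 'v' = video playback
    exp: Math.floor(Date.now() / 1000) + 3600,  // expires in 1 hour
    kid: signingKeyId,                          // your Mux signing key ID
    ...modifiers, // e.g. { max_resolution: '720p', rendition_order: 'desc' }
  };
}

const claims = buildPlaybackClaims('YOUR_PLAYBACK_ID', 'YOUR_KEY_ID', {
  max_resolution: '720p',
});
```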

# Using playback modifiers in Mux Player

Mux Player supports `min_resolution`, `max_resolution`, and `rendition_order` as attributes on the web component and as props on the React component.

For example, to set the `max_resolution` parameter with Mux Player, set the `max-resolution="720p"` attribute (`maxResolution="720p"` in React). When this attribute is set, Mux Player will internally append it as a query parameter to the streaming URL.

As with all playback modifiers, if you're using signed URLs, your parameters should be encoded in the `playback-token` attribute (`tokens.playback` in React).

# When using AVPlayer on iOS

Set the playback modifier by appending a `URLQueryItem` to the playback `URL`. [Initialize `AVPlayer` using the `URL` itself](https://developer.apple.com/documentation/avfoundation/avplayer/1385706-init) as shown in an example below using `max_resolution` or [initialize with an `AVPlayerItem` constructed with the URL](https://developer.apple.com/documentation/avfoundation/avplayer/1387104-init).

```objc


/// Creates a playback URL.
///
/// - Parameters:
///     - playbackID: playback ID for the asset.
///     - enableMaximumResolution: if true include a query parameter 
///     that sets a maximum resolution for the video.
///
/// - Returns: Playback URL with maximum resolution query param appended.
///
- (NSURL *)playbackURLWithPlaybackID:(NSString *)playbackID
           enableMaximumResolution:(BOOL)enableMaximumResolution {
  if (enableMaximumResolution) {
      NSURLComponents *components = [
          NSURLComponents componentsWithURL: [NSURL URLWithString: @"https://stream.mux.com/"]
          resolvingAgainstBaseURL: NO
      ];
      components.path = [NSString stringWithFormat: @"%@.m3u8", playbackID];

      NSURLQueryItem *queryItem = [
          NSURLQueryItem queryItemWithName: @"max_resolution"
          value: @"720p"
      ];
      components.queryItems = @[queryItem];

      return [components URL];
  } else {
      NSString *formattedURLString = [
          NSString stringWithFormat: @"https://stream.mux.com/%@.m3u8", playbackID
      ];
      return [NSURL URLWithString: formattedURLString];
  }
}

NSString *playbackID = @"qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4";

NSURL *url = [self playbackURLWithPlaybackID: playbackID
                     enableMaximumResolution: NO];

AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL: url];
AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem: playerItem];
 
```

```swift

import AVKit
import Foundation

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

// Flag controlling if a max resolution is requested
let shouldLimitResolutionTo720p = true

let player = AVPlayer(
    using: playbackID,
    limitResolutionTo720p: shouldLimitResolutionTo720p
)

extension AVPlayer {
    /// Initializes a player configured to stream
    /// the provided asset's playback ID.
    /// - Parameters:
    ///   - playbackID: a playback ID of your asset
    ///   - limitResolutionTo720p: if true configures
    ///   the player to select a resolution no higher
    ///   than 720p. False by default. 
    convenience init(
        using playbackID: String,
        limitResolutionTo720p: Bool = false
    ) {
        let playbackURL = URL.makePlaybackURL(
            playbackID: playbackID,
            limitResolutionTo720p: limitResolutionTo720p
        )
        
        self.init(
            url: playbackURL
        )
    }
}

/// Convenience extensions for working with URLs
extension URL {
    /// Convenience initializer for a static URL
    /// - Parameters:
    ///   - staticString: a static representation
    ///   of a valid URL, supplying an invalid URL
    ///   results in precondition failure
    init(staticString: StaticString) {
        guard let url = URL(
            string: "\(staticString)"
        ) else {
            preconditionFailure("Invalid URL static string")
        }

        self = url
    }

    /// Convenience constructor for a playback URL with
    /// an optional 720p limit
    /// - Parameters:
    ///     - baseURL: either the Mux stream URL or can be
    ///     customized if using Custom Domains for Mux Video.
    ///     - playbackID: playback ID for the asset
    ///     - limitResolutionTo720p: set an upper threshold for the
    ///     resolution chosen by the player to 720p. By default no limit 
    ///     is set and the player can choose any available resolution.
    /// - Returns: a playback URL for a Mux Video asset with a resolution
    /// limit if it is requested
    static func makePlaybackURL(
        baseURL: StaticString = "https://stream.mux.com",
        playbackID: String,
        limitResolutionTo720p: Bool = false
     ) -> URL {
        var components = URLComponents(
            url: URL(
                staticString: baseURL
            ),
            resolvingAgainstBaseURL: false
        )

        components?.path = "/\(playbackID).m3u8"

        if limitResolutionTo720p {
            components?.queryItems = [
                URLQueryItem(
                    name: "max_resolution",
                    value: "720p"
                )
            ]
        }
    
        guard let playbackURL = components?.url else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
     }
}

```



# Autoplay your videos
Use this guide to understand how to autoplay your videos on web-based players.
If you are autoplaying videos with any web-based player that uses the video element, it helps to understand how browsers handle autoplay so that you can provide the best experience for your users. This applies to video elements with the autoplay attribute and any time you call `play()` on a video element (this includes all HTML5 players like Video.js, JW Player, Shaka Player, etc.).

Browser vendors frequently change their policies about when autoplay is and isn't allowed, so your application should be prepared to deal with both scenarios, and we want to make sure we're tracking your views and errors accurately.

<Callout title="Autoplay with Mux Player">
  Mux Player has extra options that handle some of the following autoplay recommendations for you. Check out the [Mux Player autoplay guide](/docs/guides/player-core-functionality#autoplay) for details.
</Callout>

## Increase your chance of autoplay working

There are a few conditions that will increase your chance of autoplay working.

* Your video is muted with the muted attribute.
* The user has interacted with the page with a click or a tap.
* (Chrome - desktop) The user’s [Media Engagement Index](https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#mei) threshold has been crossed. Chrome keeps track of how often a user consumes media on a site and if a user has played a lot of media on this site then Chrome will probably allow autoplay.
* (Chrome - mobile) The user has added the site to their home screen.
* (Safari) Device is not in power-saving mode.

<Callout type="error" title="Autoplay will never work 100% of the time">
  Even if autoplay works when you test it out, you can never rely on it working for every one of your users. Your application must be prepared for autoplay to fail.
</Callout>

## Avoid the `autoplay` attribute

When you use the `autoplay` attribute on a `<video>` element (it looks like `<video autoplay>`), you have no way to know if the browser blocked or didn't block autoplay.

It is recommended to use `video.play()` instead, which returns a promise and lets you know whether playback started successfully. If autoplay worked, the promise will resolve; if autoplay did not work, the promise will reject with an error. The great thing about this approach is that you can choose what to do with the error.

For example, you can report the error to your own error-tracking tools or update the UI to reflect it. Note that Mux's custom error tracking is for fatal errors, so you wouldn't want to report an autoplay failure to Mux, because it would then be counted as a fatal error.

```js
const video = document.querySelector('#my-video');

video.play().then(function () {
  // autoplay was successful!
}).catch(function (error) {
  // do something if you want to handle or track this error
});
```
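Building on that pattern, a common fallback is to mute and retry when the unmuted `play()` is rejected, since muted autoplay is far more likely to be allowed. A sketch (the status strings and retry policy are illustrative, not part of any player API):

```javascript
// Fallback sketch: if unmuted autoplay is blocked, mute and retry once.
// Works with any object exposing `muted` and a promise-returning `play()`.
function attemptAutoplay(video) {
  return video
    .play()
    .then(() => 'playing')
    .catch(() => {
      video.muted = true; // muted autoplay is far more likely to be allowed
      return video
        .play()
        .then(() => 'playing-muted')
        .catch(() => 'blocked'); // show a "click to play" UI instead
    });
}
```

If both attempts fail, your UI should fall back to showing a poster image with a play button rather than treating it as a fatal error.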

For further reading, see [the mux blog post](https://mux.com/blog/video-autoplay-considered-harmful/) about this topic.


# Use your own custom domain
Use your own domain for live streaming ingest and for live and on-demand playback. For live streaming ingest, we support CNAME-ing, or "canonical naming," `global-live.mux.com`. For playback, we support streaming videos or serving images from your own domain.
## Use your own domain name for live ingest

CNAME-ing, short for "Canonical Naming", is a configuration that allows you to change the default domain name we provide.

Add a [CNAME](https://en.wikipedia.org/wiki/CNAME_record) to your domain's [DNS](https://en.wikipedia.org/wiki/Domain_Name_System) settings, and configure it to point to `global-live.mux.com`. After a short amount of time, you should be able to use your own domain name for ingest.

<Callout type="warning" title="RTMP, RTMPS and SRT with custom domains">
  Mux supports RTMP, RTMPS, and SRT ingestion. RTMP and SRT support custom domains; configure the CNAME record to point at the relevant ingest URL's domain (such as global-live.mux.com, or a [regional live ingest URL](/docs/guides/configure-broadcast-software#available-ingest-urls)). Custom domains will not work with RTMPS.

  Please reach out to [our support team](/support) with additional details of your requirements.
</Callout>

Here are a few popular domain services with CNAME-ing instructions. If your domain service is not listed, try searching their support resources.

* [Cloudflare](https://support.cloudflare.com/hc/en-us/articles/360020348832-Configuring-a-CNAME-setup)
* [Google Domains](https://support.google.com/a/answer/47283?hl=en)
* [AWS Route 53](https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#CNAMEFormat)
* [GoDaddy](https://www.godaddy.com/help/add-a-cname-record-19236)

Note that the CNAME doesn't have to be `global-live`; it can be anything you want it to be.

After configuring your DNS settings it may take a few hours before the new configuration works, depending on your DNS provider.

Here are a few example RTMPS and RTMP ingest URLs, including an RTMP URL changed to a custom domain:

```text
# RTMPS examples
rtmps://global-live.mux.com:443/app
# RTMP examples
rtmp://global-live.mux.com:5222/app
rtmp://your-cname.your-site.com:5222/app
```

## Use your own domain for delivering videos and images

For delivery, a custom domain allows you to play videos or serve images from your own domain, such as `media.mycustomdomain.com`, rather than stream.mux.com or images.mux.com.

You might be interested in this feature if you want a consistent brand presence across all your assets, want to sandbox your videos, or need your domain to be allowlisted. If you are interested in this feature, please reach out to your Mux account team.

## Availability

Custom domains for playback are available to customers with an annual contract with Mux. If you do not have an annual contract with Mux, you can add this feature on for $100 per month. Please reach out to our [Support Team](/support) to get set up.


# Embed videos for social media
Learn how to embed your videos using Open Graph cards
## Introduction to Open Graph

The Open Graph protocol uses HTML meta tags in the `<head>` section of a webpage to describe objects on the page, such as a video and its thumbnail. These descriptions can be used in social media posts or appear in search results. Open Graph also helps search engines find videos on your webpage that might otherwise be hidden by JavaScript.

Here is a list of Open Graph properties for video optimization:

| Property        | Description                                            |
| :---------------| :----------------------------------------------------- |
| `og:type`         | The object’s type, e.g., video, audio                  |
| `og:url`          | The URL of the webpage                                 |
| `og:title`       | Title of the video                                     |
| `og:description`  | Description of the video                               |
| `og:image`        | Thumbnail of the video                                 |
| `og:video`        | The URL of the video                                   |
| `og:video:width`  | Width of the video in pixels                           |
| `og:video:height` | Height of the video in pixels                          |
| `og:site_name`    | The name of the website that contains the video        |

### Object types

You can also use subtypes. For example, if your object type is video and you want to create an Open Graph card
for an episode or movie, you can use `video.episode` or `video.movie`.

```html
<meta property="og:type" content="video.episode">
<meta property="og:type" content="video.movie">
<meta property="og:type" content="video.tv_show">
```

### Optional meta tags

Use additional properties to provide more metadata about your object, such as the actor and director.

| Property        | Description                                         |
| :----------- | :----------------------------------------------------- |
| `video:actor` | profile array - Actors in the movie. |
| `video:actor:role` | string - The role they played. |
| `video:director` | profile array - Directors of the movie. |
| `video:writer` | profile array - Writers of the movie. |
| `video:duration` | integer >=1 - The movie's length in seconds. |
| `video:release_date` | datetime - The date the movie was released. |
| `video:tag` | string array - Tag words associated with this movie. |

### Integrate the Open Graph meta tags

To use Open Graph meta tags on your website, add them to the `<head>` section of the webpage.
Below is an example of Open Graph tags:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta property="og:title" content="Mux Video" />
    <meta property="og:type" content="video.episode" />
    <meta property="og:description" content="MP4 video asset for Open Graph Cards" />
    <meta property="og:image" content="https://image.mux.com/aYKMM7VxaD2InrbhrKlhi00V6R9EpRmQNmBJ10200AK02bE/thumbnail.png" />
    <meta property="og:video" content="https://stream.mux.com/F9cP5Xgdcp7028hN4gQrOmlF62ZDHNloCTQQao8Pk00kk/medium.mp4" />
    <meta property="og:video:width" content="350">
    <meta property="og:video:height" content="200">
    <meta property="og:video:duration" content="300">
    <meta property="og:url" content="http://mux.com">
  </head>
  <body>
    <video
      id="my-player"
      controls
      style="width: 100%; max-width: 500px;"
    ></video>
  </body>
</html>
```
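If you render pages from asset data, these tags can be generated programmatically. Here's a JavaScript sketch, assuming the same `image.mux.com` thumbnail and `stream.mux.com` MP4 URL patterns as the example above (the playback ID is a placeholder, and the MP4 URL requires static renditions to be enabled on the asset):

```javascript
// Sketch: generate Open Graph meta tags for a Mux asset.
// Playback ID, title, and dimensions are placeholders supplied by the caller.
function openGraphTags({ playbackId, title, description, width, height }) {
  const tags = {
    'og:type': 'video.episode',
    'og:title': title,
    'og:description': description,
    'og:image': `https://image.mux.com/${playbackId}/thumbnail.png`,
    'og:video': `https://stream.mux.com/${playbackId}/medium.mp4`,
    'og:video:width': String(width),
    'og:video:height': String(height),
  };
  return Object.entries(tags)
    .map(([property, content]) => `<meta property="${property}" content="${content}" />`)
    .join('\n');
}
```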

## Creating Twitter/X cards

Twitter cards are implemented using meta tags, but unlike Open Graph cards, they use different property names.

There are four different types of cards to choose from, which are defined in the `twitter:card` meta tag property:

* photo card
* player card
* summary card
* app card

The player card provides functionality to play external media files inside of Twitter. Below are the definitions of other Twitter meta tags you can use with the player card.

### Twitter/X meta tags

| Property        | Description                                         |
| :----------- | :----------------------------------------------------- |
| `twitter:card` | Type of Twitter card e.g., "player." |
| `twitter:title` | The title of your content as it should appear in the card |
| `twitter:site` | The Twitter @username the card should be attributed to |
| `twitter:description` | Description of the content (optional) |
| `twitter:player` | HTTPS URL to the iframe player; this must be an HTTPS URL that does not generate active mixed-content warnings in a web browser (the URL of the page hosting the player) |
| `twitter:player:width` | Width of the iframe specified in `twitter:player`, in pixels |
| `twitter:player:height` | Height of the iframe specified in `twitter:player`, in pixels |
| `twitter:image` | Image displayed in place of the player on platforms that don’t support iframes or inline players; make this image the same dimensions as your player |

### Example HTML

```html
<meta name="twitter:card" content="player" />
<meta name="twitter:title" content="Some great video" />
<meta name="twitter:site" content="@twitter_username">
<meta name="twitter:description" content="Great video by @twitter_username" />
<meta name="twitter:player" content="https://link-to-a-videoplayer.com" />
<meta name="twitter:player:width" content="360" />
<meta name="twitter:player:height" content="200" />
<meta name="twitter:image" content="https://link-to-a-image.com/image.jpg" />
```

## Preview your Open Graph cards

You can preview your Open Graph cards using any of the many services that generate a preview from a URL you enter.

One such service is [Opengraph.xyz](https://opengraph.xyz), which lets you preview what you have configured and also helps you generate more Open Graph meta tags.

Below is a preview of the example above in [Opengraph.xyz](https://opengraph.xyz)

<Image src="/docs/images/OpenGraph_preview.png" width={1790} height={491} alt="Preview Open Graph cards" />


# Use static MP4 and M4A renditions
Learn how to create downloadable MP4 and M4A files from your videos for offline playback, social media sharing, transcription services, and legacy device support.
<Callout type="info">
  This guide covers using `static_renditions` to create MP4 files, which replaces the deprecated `mp4_support` method. While `mp4_support` continues to function, we recommend using `static_renditions` for all new implementations.

  For details on the older method, see [enabling static mp4 renditions using mp4\_support](/docs/guides/enable-static-mp4-renditions-using-mp4-support).
</Callout>

## What are static MP4 and M4A renditions?

Static renditions are downloadable versions of your video assets in MPEG-4 video (`.mp4`) or audio (`.m4a`) format. These files are created alongside the default HLS streaming format and can be used for downloading or streaming the content.

Static renditions allow you to create downloadable files that can be used for:

* Supporting very old devices, like Android \< v4.0 (less than 1% of Android users)
* Supporting assets that are very short in duration (e.g., \< 10s) on certain platforms
* Embedding a video in [Open Graph cards](/docs/guides/embed-videos-for-social-media) for sharing on sites like Facebook and Twitter
* Downloading videos for offline viewing

It also allows users to download M4A audio files, which may be useful for:

* Feeding into transcription services
* Delivering a streamable audio-only file to an audio element
* Downloading an audio-only file, useful for things like podcasts

In the majority of other cases, you'll want to use our default HLS (.m3u8) format, which provides a better viewing experience by dynamically adjusting the quality level to the viewer's connection speed.
If time-to-ready is important, note that the HLS version of a video will also be ready sooner than the MP4 versions.

## How video quality affects static renditions

Static renditions are created at the same quality level (Basic, Plus, or Premium) as your original Mux video, and can be the highest-quality video rendition available, a specific resolution, or an audio-only version. For videos with multiple versions at the highest resolution (which can happen with Premium quality), we'll use the highest-quality version available.

## How to enable static renditions

There are several points in an asset's lifecycle where you can enable static renditions. You can enable them when initially creating an asset, add them later to an existing asset, or configure them as part of a direct upload. The method you choose will depend on your workflow and when you determine you need the static renditions.

### During asset creation

You can add static renditions to an asset when <ApiRefLink href="/docs/api-reference/video/assets/create-asset">creating an asset</ApiRefLink> by including the `"static_renditions": []` array parameter and specifying a `{ "resolution": <option> }` object as an array element for each static rendition that should be created.

There are two types of static renditions: `standard` and `advanced`. Standard static rendition MP4s provide the most common options needed for generating static renditions: either the highest video rendition possible, or an audio-only copy of the content. Advanced static rendition MP4s allow you to specify the exact resolution of the static renditions that are generated. Standard static renditions incur a cost for the number of minutes [stored](/docs/pricing/video#static-rendition-mp4s-storage) and [delivered](/docs/pricing/video#static-renditions-mp4s). Advanced static renditions also incur a [cost per minute of MP4s that are generated](/docs/pricing/video#advanced-static-rendition-mp4s-encoding).

#### Standard Static Rendition Options

The standard static rendition options are:

* `highest`: Produces an MP4 file with the video resolution up to 4K (2160p).
* `audio-only`: Produces an M4A (audio-only MP4) file for a video asset.

One or both options can be specified.

* For an audio-only asset: The `audio-only` rendition option will produce an M4A file and `highest` is skipped.
* For a video-only asset: The `highest` rendition option will produce an MP4 file and `audio-only` is skipped.

Here's an example of creating an asset with `static_renditions` specified using the `highest` and `audio-only` options:

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "static_renditions" : [ 
    {
      "resolution" : "highest"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

#### Advanced Static Rendition Options

The advanced supported resolutions are:

* 270p
* 360p
* 480p
* 540p
* 720p
* 1080p
* 1440p
* 2160p

Mux will not upscale to produce MP4 renditions; renditions that would require upscaling are skipped.

Note that advanced explicit resolution static renditions cannot be mixed with the `highest` standard static rendition. However, they can be generated on the same asset as the `audio-only` rendition.

Here's an example of creating an asset with `static_renditions` specified using the `720p`, `480p`, and `audio-only` options:

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "static_renditions" : [ 
    {
      "resolution" : "720p"
    },
    {
      "resolution" : "480p"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

### After asset creation

You can add static renditions to existing assets retroactively by calling the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-static-rendition">create static rendition API</ApiRefLink>, as shown below. The create static rendition API will need to be called for each static rendition you would like to add to the asset.

```json
// POST /video/v1/assets/{ASSET_ID}/static-renditions
{
  "resolution" : "highest"
}
```
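Because the API takes one rendition per call, adding several renditions to an existing asset means looping over the resolutions. Here's a minimal sketch, assuming Node 18+ (for global `fetch`) and API credentials in `MUX_TOKEN_ID` / `MUX_TOKEN_SECRET`; the helper names are illustrative, not part of a Mux SDK:

```javascript
// Sketch: add several static renditions to an existing asset, one API call
// per rendition. Assumes Node 18+ (global fetch) and MUX_TOKEN_ID /
// MUX_TOKEN_SECRET environment variables holding your API credentials.
const MUX_API = "https://api.mux.com";

// Pure helper describing a single create-static-rendition request.
function buildRenditionRequest(assetId, resolution) {
  return {
    method: "POST",
    path: `/video/v1/assets/${assetId}/static-renditions`,
    body: { resolution },
  };
}

// Issue one request per requested rendition, in order.
async function addStaticRenditions(assetId, resolutions) {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString("base64");

  for (const resolution of resolutions) {
    const req = buildRenditionRequest(assetId, resolution);
    const res = await fetch(`${MUX_API}${req.path}`, {
      method: req.method,
      headers: {
        Authorization: `Basic ${auth}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(req.body),
    });
    if (!res.ok) throw new Error(`Failed to add ${resolution}: ${res.status}`);
  }
}
```

For example, `addStaticRenditions(assetId, ["720p", "audio-only"])` would issue two POSTs against the same asset.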

### During direct upload

To enable static renditions for direct upload, you need to specify the same `static_renditions` field within `new_asset_settings`, as shown below:

```json
// POST /video/v1/uploads
{
  "cors_origin": "https://example.com/",
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "static_renditions" : [ 
      {
        "resolution" : "highest"
      }
    ]
  }
}
```

### During live stream creation

Static renditions can be created from the recorded version of a live stream. This is useful if you want to create downloadable files from a live stream soon after the live stream is finished.

To enable static renditions for the recorded version of a future live stream, use the `static_renditions` property in the `new_asset_settings` when <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">creating the live stream</ApiRefLink>.

```json
// POST /video/v1/live-streams
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "static_renditions" : [
      {
        "resolution" : "highest"
      }
    ]
  }
}
```

### After live stream creation

To update the static renditions that are configured to be created from the recorded version of a future live stream, use the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream-new-asset-settings-static-renditions">update live stream static renditions API</ApiRefLink>.

```json
// PUT /video/v1/live-streams/{LIVE_STREAM_ID}/new-asset-settings/static-renditions
{
  "static_renditions" : [
    {
      "resolution" : "highest"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

## Access static renditions

After adding static renditions, you'll see an additional key on the asset object called `static_renditions`. This object contains the information about which static renditions are available.

```json
{
  ...all asset details...
  "static_renditions" : [
    {
      "id" : "ABC123",
      "type" : "standard",
      "status" : "preparing",
      "resolution" : "highest",
      "name" : "highest.mp4",
      "ext":"mp4"
    },
    {
      "id" : "GHI678",
      "type" : "standard",
      "status" : "preparing",
      "resolution" : "audio-only",
      "name" : "audio.m4a",
      "ext":"m4a"
    }
  ]
}
```

<Callout type="info" title="Static rendition status">
  Static renditions take longer to create than our default HLS version of the video, so they will not be ready immediately when the asset status is `ready`.

  The `static_renditions[].status` parameter refers to the current processing status of each static rendition. Each static rendition is ready when its `static_renditions[].status` is `ready`, at which point a `video.asset.static_rendition.ready` webhook is fired.
</Callout>

You can build the streaming URL by combining the playback ID with the name of the static rendition. The URL follows this pattern:

```html
https://stream.mux.com/{PLAYBACK_ID}/{STATIC_RENDITION_NAME}
```

The `name` field in each static rendition object (like `highest.mp4` or `audio.m4a`) is what you'll use as the `STATIC_RENDITION_NAME`.

```
ex. https://stream.mux.com/abcd1234/highest.mp4
ex. https://stream.mux.com/abcd1234/audio.m4a
```

If you want a browser to download the MP4 or M4A file rather than attempt to stream it, provide a file name to save it as via the `download` query parameter:

```
https://stream.mux.com/{PLAYBACK_ID}/{STATIC_RENDITION_NAME}?download={SAVED_FILE_NAME}
```

For example, if you want to save the `highest.mp4` file as `cats.mp4`, you can use the following URL:

```
ex. https://stream.mux.com/abcd1234/highest.mp4?download=cats
```
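A tiny helper that assembles these URLs from a playback ID and a rendition's `name` field (a sketch; the function name is illustrative):

```javascript
// Sketch: build streaming and download URLs for a static rendition from the
// playback ID and the rendition's `name` field on the asset object.
function staticRenditionUrl(playbackId, renditionName, savedFileName) {
  const base = `https://stream.mux.com/${playbackId}/${renditionName}`;
  // Appending ?download=... tells the browser to download instead of stream.
  return savedFileName
    ? `${base}?download=${encodeURIComponent(savedFileName)}`
    : base;
}

// staticRenditionUrl("abcd1234", "highest.mp4", "cats")
// → "https://stream.mux.com/abcd1234/highest.mp4?download=cats"
```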

### Accessing static renditions of DRM enabled assets

You cannot access static renditions using the playback ID of a `drm` playback policy. If you want to use static renditions, you must add a `public` or `signed` advanced playback policy alongside the `drm` policy.

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }, 
    {
      "policy": "signed"
    }
  ],
  "static_renditions": [
    {
      "resolution": "highest"
    }
  ],
  "video_quality": "plus"
}
```

## Remove static renditions

### From an asset

To remove static renditions from an asset, use the <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-static-rendition">delete static rendition API</ApiRefLink>. Call it with the `id` of each rendition you want to remove from the asset. The static rendition files are deleted when they are removed from an asset.

```json
// DELETE /video/v1/assets/{ASSET_ID}/static-renditions/{STATIC_RENDITION_ID}
```

To completely disable static renditions on an asset, delete all of the static renditions configured on the asset.

### From future live streams

To remove the static renditions that are configured to be created from the recorded version of a future live stream, use the <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-new-asset-settings-static-renditions">delete live stream static renditions API</ApiRefLink>.

```json
// DELETE /video/v1/live-streams/{LIVE_STREAM_ID}/new-asset-settings/static-renditions
```

## Webhooks

Your application can be automatically updated with the status of static renditions for an asset through [webhooks](/docs/core/listen-for-webhooks).

There are five events you can receive, which can be fired for each individual static rendition.

| Webhook       | Description   |
| :------------ |:--------------|
|`video.asset.static_rendition.created` |Emitted when a static rendition entry is created and the file is being prepared. |
|`video.asset.static_rendition.ready` |Emitted when a static rendition is ready to be downloaded. |
|`video.asset.static_rendition.errored` |Emitted when a static rendition fails to be generated. |
|`video.asset.static_rendition.skipped` |Emitted when a static rendition is skipped because the requested resolution conflicts with the asset metadata. For example, specifying `audio-only` for a video-only asset or `highest` for an audio-only asset. |
|`video.asset.static_rendition.deleted` |Emitted when an individual static rendition is deleted. Note: This event is not emitted when the parent asset is deleted. |
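A sketch of a handler that keeps a per-rendition status map up to date from these events. The handler and store are hypothetical, and the payload shape assumed here (event name on `event.type`, a rendition `id` under `event.data`) should be checked against the webhook payload reference:

```javascript
// Sketch: track per-rendition status from static-rendition webhook events.
// Assumes the parsed webhook payload exposes `event.type` and a rendition
// `id` under `event.data` — verify against the webhook reference.
function handleStaticRenditionEvent(event, store) {
  const statusByType = {
    "video.asset.static_rendition.created": "preparing",
    "video.asset.static_rendition.ready": "ready",
    "video.asset.static_rendition.errored": "errored",
    "video.asset.static_rendition.skipped": "skipped",
  };
  if (event.type === "video.asset.static_rendition.deleted") {
    // The rendition no longer exists; drop it from our records.
    delete store[event.data.id];
    return store;
  }
  const status = statusByType[event.type];
  if (status) store[event.data.id] = status;
  return store;
}
```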

## Signed static rendition URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for MP4 URLs.

You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).

If you run into any trouble signing requests, please reach out to [Mux Support](/support) and we'll be able to help.

## Migrate from the deprecated `mp4_support` parameter

Previously, MP4 support was specified using the `mp4_support` parameter on an asset. This method continues to work, though it has been deprecated, and new functionality will use the `static_renditions` array.

The `mp4_support` parameter and the `static_renditions` array cannot be used at the same time on an asset.

To use the `static_renditions` array with assets that have MP4 support enabled using `mp4_support`, first use the <ApiRefLink href="/docs/api-reference/video/assets/update-asset-mp4-support">update asset MP4 support API</ApiRefLink>, setting `mp4_support` to `none` to remove it. Then you can create the static renditions individually as described above.

```json
// PUT /video/v1/assets/{ASSET_ID}/mp4-support
{
  "mp4_support": "none"
}
```

Similarly, the `mp4_support` parameter cannot be used if an asset has existing `static_renditions` specified. Delete the static renditions first, and then the legacy `mp4_support` parameter can be enabled.
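The migration can be expressed as an ordered request plan: first the `PUT` that disables `mp4_support`, then one `POST` per desired rendition. A sketch with an illustrative helper name:

```javascript
// Sketch: the two-step migration from `mp4_support` to `static_renditions`,
// expressed as the ordered sequence of API requests to issue.
function mp4SupportMigrationPlan(assetId, resolutions) {
  return [
    // Step 1: remove the legacy mp4_support setting.
    {
      method: "PUT",
      path: `/video/v1/assets/${assetId}/mp4-support`,
      body: { mp4_support: "none" },
    },
    // Step 2: create each static rendition individually.
    ...resolutions.map((resolution) => ({
      method: "POST",
      path: `/video/v1/assets/${assetId}/static-renditions`,
      body: { resolution },
    })),
  ];
}
```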

## Pricing

Additional storage fees apply for assets that have static renditions enabled.

Streaming of static renditions is charged at the same rate as HLS streaming.

[See pricing documentation for full details](/docs/pricing/video#static-renditions-mp4s)


# Download for offline editing
Learn how to download a master quality video for editing and post production or archiving
## Why download the master

When a video is ingested into Mux, we store a version of the video that's equivalent in quality to the original; we call this the master. The `max_resolution_tier` of an asset determines the master file's resolution, e.g. a `max_resolution_tier` of `2160p` will result in a 4K master file. All of the streamed versions of the video are created from the master, and the master itself is never streamed to a video player because it's not optimized for streaming.

There are a few common use cases where Mux may have the only copy of the original video:

* You're using Mux live streaming and the only copy is the recorded asset after the event
* You're using Mux's [direct upload](/docs/guides/upload-files-directly) feature so Mux has the only copy
* You deleted the original version from your own cloud storage because Mux is already storing a high quality version for you

When this is the case, there are a number of reasons you may want to retrieve the master version from Mux, including:

* Allowing users to download the video and edit it in a tool like Final Cut Pro
* Archiving the video for the future, for example if you're un-publishing (deleting) a video asset from Mux
* Moving your videos to another service

Enabling master access will create a *temporary* URL to the master version as an MP4 file. All MP4 master downloads include the video track and the primary audio track (if one exists) muxed into a single file.
You can use this URL to download the video to your own hosting, or provide the URL to a user to download directly from Mux.

**The URL will expire after 24 hours, but you can enable master access on any asset at any time.**

<Callout type="warning" title="API Only!">
  The methods described here are available only via the Mux API; you won't find these features in the Mux Dashboard.
</Callout>

## Enable master access

If you want the master to be available soon after a video is uploaded, use the `master_access` property when <ApiRefLink href="/docs/api-reference/video/assets/create-asset">creating an asset</ApiRefLink>.

```json
{
  "inputs": [
    {
      "url": "VIDEO_URL"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "video_quality": "basic",
  "master_access": "temporary"
}
```

You can also add it afterward by <ApiRefLink href="/docs/api-reference/video/assets/update-asset-master-access">updating the asset</ApiRefLink>.

### Enable master access when a live stream finishes

If you want to download the recorded version of a live stream soon after the live stream is finished, use the `master_access` property in the `new_asset_settings` when <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">creating the live stream</ApiRefLink>.

```json
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic",
    "master_access": "temporary"
  }
}
```

## Retrieving the URL to the master

After master access has been requested, a new object called `master` will exist on the asset details object.

```json
{
  ...all asset details...
  "master_access": "temporary",
  "master": {
    "status": "preparing"
  }
}
```

Making the master available is an asynchronous process that happens after an asset is `ready` or, in the case of a live-streamed asset, after the `live_stream_completed` event. Because of this, the `master` object has a `status` property that gives the current state of the master, starting with `preparing`.

In most cases, the master will be available very quickly. When it's ready the `status` will be updated to `ready`, and a new `url` property will exist on the object. This is the URL you can use to download the master yourself, or to let a user download the master.

```json
{
  ...all asset details...
  "master_access": "temporary",
  "master": {
    "status": "ready",
    "url": "https://mezzanine.mux.com/ABC123/mezzanine.mp4?skid=foo&signature=bar"
  }
}
```
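One way to wait for the URL is to poll the asset until `master.status` becomes `ready` (the `video.asset.master.ready` webhook is a better fit in production; polling is shown for simplicity). A sketch assuming Node 18+ and credentials in `MUX_TOKEN_ID` / `MUX_TOKEN_SECRET`; the helper names are illustrative:

```javascript
// Sketch: poll an asset until its master is ready, then return the temporary
// download URL. Assumes Node 18+ (global fetch) and MUX_TOKEN_ID /
// MUX_TOKEN_SECRET environment variables.

// Pure helper: extract the master URL from an asset object, or null if the
// master is still preparing (or has no master at all).
function masterUrlFromAsset(asset) {
  if (asset.master && asset.master.status === "ready") return asset.master.url;
  return null;
}

async function waitForMaster(assetId, intervalMs = 5000) {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString("base64");
  for (;;) {
    const res = await fetch(`https://api.mux.com/video/v1/assets/${assetId}`, {
      headers: { Authorization: `Basic ${auth}` },
    });
    const { data: asset } = await res.json();
    const url = masterUrlFromAsset(asset);
    if (url) return url;
    if (asset.master && asset.master.status === "errored") {
      throw new Error("Master preparation failed");
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```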

## Customizing the filename

The filename of the downloaded file can be controlled by appending the `download` query parameter to the URL returned from the API, for example:

```http
https://mezzanine.mux.com/ABC123/mezzanine.mp4?skid=foo&signature=bar&download=desired_filename.mp4
```

This will cause the browser to download the file with the name `desired_filename.mp4` instead of the default name.

<Callout type="warning" title="Important">
  It is important that you do not modify the URL or query parameters returned from the API in any other way than to add the `download` parameter.
</Callout>

## Webhooks for master access

Your application can be automatically updated with the status of master access for an asset through [webhooks](/docs/core/listen-for-webhooks).

There are four related events you can receive.

| Webhook       | Description   |
| :------------ |:--------------|
|`video.asset.master.preparing` | Received when master access is first requested |
|`video.asset.master.ready` |Received when the URL to the master is available |
|`video.asset.master.deleted` |Received if master access has been set to `none` via a PUT to the `master-access` endpoint |
|`video.asset.master.errored` |Received if an unexpected error happens while making the master available |


# Get thumbnails and images from a video
Learn how to get images from a video to show a preview thumbnail or poster image.
## Get an image from a video

To get an image from Mux, use a `playback_id` to make a request to `image.mux.com` in the following format:

```
https://image.mux.com/{PLAYBACK_ID}/thumbnail.{png|jpg|webp}
```

<Image src="/docs/images/thumbnail-1.png" width={1920} height={800} />

Images can be served in `webp`, `png`, or `jpg` format. WebP is an image format that uses both lossy and lossless compression methods to reduce image size while maintaining good quality, so `webp` images will typically yield a smaller file size than `png` or `jpg`.

You can control how the image is created by including the following query string parameters with your request. If you don't include any, Mux will default to choosing an image from the middle of your video.

### Thumbnail Query String Parameters

| Parameter     | Type          | Description                                           |
| :-------------|:--------------|:------------------------------------------------------|
| `time`        | `float`       | The time (in seconds) of the video timeline where the image should be pulled. Defaults to the middle of the original video.|
| `width`       | `int32`       | The width of the thumbnail (in pixels). Defaults to the width of the original video. |
| `height`      | `int32`       | The height of the thumbnail (in pixels). Defaults to the height of the original video. |
| `rotate`      | `int32`       | Rotate the image clockwise by the given number of degrees. Valid values are `90`, `180`, and `270`. |
| `fit_mode`    | `string`      | How to fit a thumbnail within width + height. Valid values are `preserve`, `stretch`, `crop`, `smartcrop`, and `pad` (see below). |
| `flip_v`      | `boolean`     | Flip the image top-bottom after performing all other transformations. |
| `flip_h`      | `boolean`     | Flip the image left-right after performing all other transformations. |
| `latest`      | `boolean`     | When set to `true`, pulls the latest thumbnail from the playback ID of an ongoing live stream. Can only be used with live streams. [More details](#getting-the-latest-thumbnail-from-a-live-stream). |

The `fit_mode` parameter can have the following values:

* `preserve` : By default, Mux will preserve the aspect ratio of the video, while fitting the image within the requested width and height. For example if the thumbnail width is 100, the height is 100, and the video's aspect ratio is 16:9, the delivered image will be 100x56 (16:9).
* `stretch` : The thumbnail will exactly fill the requested width and height, even if it distorts the image. Requires both width and height to be set. (Not very popular.)
* `crop` : The video image will be scaled up or down until it fills the requested width and height box. Pixels then outside of the box will be cropped off. The crop is always centered on the image. Requires both width and height to be set.
* `smartcrop` : An algorithm will attempt to find an area of interest in the image and center it within the crop, while fitting the requested width and height. Requires both width and height to be set.
* `pad` : Similar to preserve but Mux will "letterbox" or "pillar box" (add black padding to) the image to make it fit the requested width and height exactly. This is less efficient than preserve but allows for maintaining the aspect ratio while always getting thumbnails of the same size. Requires both width and height to be set.
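Putting the parameters together, a small URL-builder helper might look like this (a sketch; `thumbnailUrl` is an illustrative name, not a Mux SDK function):

```javascript
// Sketch: build a thumbnail URL from a playback ID, an image format, and the
// query string parameters described in the table above.
function thumbnailUrl(playbackId, format = "jpg", params = {}) {
  const query = new URLSearchParams(
    Object.entries(params).map(([key, value]) => [key, String(value)])
  ).toString();
  const base = `https://image.mux.com/${playbackId}/thumbnail.${format}`;
  return query ? `${base}?${query}` : base;
}

// thumbnailUrl("abcd1234", "png", { width: 400, height: 200 })
// → "https://image.mux.com/abcd1234/thumbnail.png?width=400&height=200"
```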

#### Example with Query String Parameters

Here is an example request for an image including query parameters which:

* has a width of 400 and a height of 200
* uses the `smartcrop` fit mode
* is taken from the 35th second of the video
* is a PNG

```
https://image.mux.com/{PLAYBACK_ID}/thumbnail.png?width=400&height=200&fit_mode=smartcrop&time=35
```

<Image src="/docs/images/thumbnail-2.png" width={400} height={200} />

<Callout type="info">
  Note that there is a default limit of 1 thumbnail and 1 GIF for every 10 seconds of duration per asset. For assets under 100 seconds in duration, the limit is 10 thumbnails and 10 GIFs. For example, you can retrieve 30 thumbnails and 30 GIFs for a 5 minute asset or 10 thumbnails and 10 GIFs for a 30 second asset.
</Callout>

## Get an animated GIF from a video

To get an animated `gif` or `webp` from Mux, use a `playback_id` associated with an asset or live stream to make a request to `image.mux.com` in the following format:

```
https://image.mux.com/{PLAYBACK_ID}/animated.{gif|webp}
```

<Image src="/docs/images/animated-image-1.gif" width={640} height={266} />

You can control how the image is created by including the following query string parameters with your request.

### Animated GIF Query String Parameters

| Parameters    | Type          | Description   |
| :------------ |:--------------|:--------------|
| `start`       | `float`       | The time (in seconds) of the video timeline where the animated GIF should begin. Defaults to 0. |
| `end`         | `float`       | The time (in seconds) of the video timeline where the GIF ends. Defaults to 5 seconds after the start. Maximum total duration of GIF is limited to 10 seconds; minimum total duration of GIF is 250ms. |
| `width`       | `int32`       | The width in pixels of the animated GIF. Default is 320px, or if height is provided, the width is determined by preserving aspect ratio with the height. Max width is 640px. |
| `height`      | `int32`       | The height in pixels of the animated GIF. The default height is determined by preserving aspect ratio with the width provided. Maximum height is 640px. |
| `fps`         | `int32`       | The frame rate of the generated GIF. Defaults to 15 fps. Max 30 fps. |

#### Example with Query String Parameters

Here is an example request for a GIF including query parameters which:

* set a width of 640
* set a frame rate of 5fps

```
https://image.mux.com/{PLAYBACK_ID}/animated.gif?width=640&fps=5
```

<Image src="/docs/images/animated-image-2.gif" width={640} height={360} />

## Common uses for image requests

Images and GIFs can be used anywhere in your project, but here are some examples of common ways you can use images from Mux.

<Callout type="error">
  Avoid using images for storyboards (timeline hover previews). To learn more about storyboards, you can view [this guide](/docs/guides/create-timeline-hover-previews).
</Callout>

### Add a poster image to your player

Most video players will default to showing a black frame with a play icon and other video controls before a user presses play to start the video playback.
You can add a poster image to most video players by providing an image URL from Mux. Here's an example using an HTML5 video element
with a poster image set.

```html
<video id="my-player" width="640" height="360" poster="https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg" controls></video>
```

### Use a GIF to show a preview

When a user is picking a video from a catalogue, you could show a preview of the video using an animated GIF whilst they hover over a thumbnail of the video.

<Image src="/docs/images/animated-catalogue-example.gif" width={640} height={300} />

You could use pure CSS, some JavaScript, or another method which best fits with your application to achieve a similar result (the example above used CSS).

## Using signed URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for images and animated GIFs.
You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).

If you run into any trouble signing image requests, please [reach out](/support) and we'll be able to help.

## Getting the latest thumbnail from a live stream

When a live stream is active, you can use the `?latest=true` query string parameter to get the latest thumbnail from the live stream. This thumbnail is refreshed every 10 seconds.

This is useful for building moderation and classification workflows when working with user-generated live streams, but can also be used to show a discovery experience, showing the active live streams in your application.

Using the `latest` parameter on a VOD asset or non-live stream will return a 400 error.

[Read the blog post for more details and examples of this feature.](/blog/latest-thumbnail)


# Create timeline hover previews
Learn how to add hover image previews to your player.
## What are timeline hover previews?

Timeline hover previews, also known as trick play or scrub bar previews, make player operations like fast-forward, rewind, and seeking more visual to the user. Here it is in action:

<Image src="/docs/images/animated-storyboard.gif" width={640} height={360} />

Each image (also called a thumbnail or tile) you see when hovering over the scrub bar (or player timeline) is part of a larger image called a storyboard.
A storyboard is a collection of thumbnails or tiles, created from video frames selected at regular time intervals and arranged in a grid layout.

Below is an example storyboard for the video [Tears of Steel](https://mango.blender.org/), the same video used to demo timeline hover previews above:

<Image src="/docs/images/storyboard.png" width={1920} height={1600} />

## Add timeline hover previews to your player

There are a few different ways to add this functionality to your players, depending on which methods your chosen player exposes to support timeline hover previews.

The storyboard image can be requested from the following URL in either `webp`, `jpg`, or `png` format from Mux:

```
https://image.mux.com/{PLAYBACK_ID}/storyboard.{png|jpg|webp}
```

Each storyboard has an associated metadata file, which can be used as a `metadata` text track. In this case, the storyboard image is referenced from the metadata.

The storyboard metadata provides the x-axis and y-axis coordinates of each image in the storyboard image and the corresponding time range. The metadata is available in both WebVTT and JSON format.

Storyboard images contain 50 tiles if the asset is less than 15 minutes in duration, and 100 tiles if the asset is longer than 15 minutes.

### WebVTT

Most popular video players use a WebVTT file to describe the individual tiles of the storyboard image. You can request the WebVTT file for a storyboard like this:

```
GET https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt
```

```
WEBVTT

00:00:00.000 --> 00:01:06.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=0,0,256,160

00:01:06.067 --> 00:02:14.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=256,0,256,160

00:02:14.067 --> 00:03:22.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=512,0,256,160

00:03:22.067 --> 00:04:28.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=768,0,256,160

00:04:28.067 --> 00:05:36.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=1024,0,256,160

00:05:36.067 --> 00:06:44.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=0,160,256,160
```

By default, this request will generate a `jpg` for the storyboard image. If you'd like to change the format to `webp`, you can do so by adding `?format=webp` to the end of the request URL:

```
GET https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?format=webp
```
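If you're wiring up a player without a VTT plugin, you can parse the `#xywh=` media fragment from each cue URL yourself to find the tile's pixel rectangle. A minimal sketch (the helper name is illustrative):

```javascript
// Sketch: extract the tile rectangle from the `#xywh=` media fragment in a
// storyboard WebVTT cue URL, e.g. "...storyboard.jpg#xywh=256,0,256,160".
function parseXywh(cueUrl) {
  const match = cueUrl.match(/#xywh=(\d+),(\d+),(\d+),(\d+)/);
  if (!match) return null;
  // Skip match[0] (the full fragment); the four captures are x, y, w, h.
  const [, x, y, w, h] = match.map(Number);
  return { x, y, width: w, height: h };
}
```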

#### WebVTT Compatible Video Players

The list below shows the various video players supporting the WebVTT files for trick play. If your player isn't listed here, [please reach out](/support), and we'll help where we can!

* [VideoJS](https://videojs.com/) + [VTT Thumbnails plugin](https://www.npmjs.com/package/videojs-vtt-thumbnails)
* [JW Player](https://docs.jwplayer.com/players/docs/ios-add-preview-thumbnails)
* [THEOplayer](https://www.theoplayer.com/docs/theoplayer/how-to-guides/texttrack/how-to-implement-preview-thumbnails/)
* [Bitmovin](https://bitmovin.com/demos/thumbnail-seeking)
* [Flow Player](https://flowplayer.com/demos/video-thumbnails)
* [Plyr](https://plyr.io)

<Callout type="info">
  Using a WebVTT file may be limited to HTML5 browser-based video players and may not be supported in device-specific SDKs, including iOS and Android. iOS, Android, and other device platforms use an HLS I-frame playlist. Generating HLS I-frame playlists is on Mux's roadmap.
</Callout>

### JSON

There are many other scenarios for using storyboards. For instance:

* A quick way of previewing the entire video can save the video editor/reviewer's time without requiring a full video playback
* Ease of developing trick play like functionality in Chromeless Video players like [hls.js](https://github.com/video-dev/hls.js/)

Using a WebVTT file for metadata can be burdensome to implement. Storyboard metadata expressed in an easy-to-understand and widely supported format like JSON makes it easier to take advantage of storyboards in new ways, so Mux provides the same storyboard metadata in JSON format.

```
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.json
```

```json
{
  "url": "https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg",
  "tile_width": 256,
  "tile_height": 160,
  "duration": 6744.1,
  "tiles": [
    {
      "start": 0,
      "x": 0,
      "y": 0
    },
    {
      "start": 66.066667,
      "x": 256,
      "y": 0
    },
    {
      "start": 134.066667,
      "x": 512,
      "y": 0
    },
    {
      "start": 202.066667,
      "x": 768,
      "y": 0
    },
    {
      "start": 268.066667,
      "x": 1024,
      "y": 0
    },
    {
      "start": 336.066667,
      "x": 0,
      "y": 160
    }
  ]
}
```

By default, this request will generate a `jpg` for the storyboard image. If you'd like to change the format to `webp`, you can do so by adding `?format=webp` to the end of the request URL:

```
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.json?format=webp
```
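With the JSON metadata, mapping a playback time to a tile is a simple scan over `tiles`: each tile applies from its `start` until the next tile's `start`. A sketch that returns CSS-style background offsets for rendering the tile from the storyboard image (helper name illustrative):

```javascript
// Sketch: given storyboard JSON metadata and a playback time in seconds,
// find the tile to display and compute CSS background offsets for it.
function tileForTime(storyboard, timeSeconds) {
  // Tiles are ordered by start time; pick the last tile whose start has passed.
  let current = storyboard.tiles[0];
  for (const tile of storyboard.tiles) {
    if (tile.start <= timeSeconds) current = tile;
    else break;
  }
  return {
    backgroundImage: `url(${storyboard.url})`,
    backgroundPosition: `-${current.x}px -${current.y}px`,
    width: storyboard.tile_width,
    height: storyboard.tile_height,
  };
}
```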

## Cross-Origin Resource Sharing (CORS) requirements

Storyboard URLs use the `image.mux.com` hostname, while video playback URLs use `stream.mux.com`. Because the URLs use different hostnames, we recommend adding the `crossorigin` attribute to the [`<video>` HTML tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/video).

## Roku trick play

Roku announced changes to their [channel certification criteria](https://blog.roku.com/developer/channel-certification-criteria-updates-october-2020) mandating trick play for on-demand video longer than 15 minutes, starting October 1st, 2020.
To support this requirement on Roku devices, add this playback modifier when making a playback request:

```none
https://stream.mux.com/{PLAYBACK_ID}.m3u8?roku_trick_play=true
```

Mux will include an Image Media Playlist in the HLS manifest to support this requirement on Roku.

<Callout type="warning" title="Using `roku_trick_play` with signed URLs">
  If you are using [signed playback URLs](/docs/guides/secure-video-playback), make sure you include the `roku_trick_play` parameter as a claim in your signed token.
</Callout>

## Using signed URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for storyboards.
You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).


# Use Video.js with Mux
Learn what video.js kit is and how to use it in your application.
## 1. Introduction to video.js kit

Video.js kit is a project built on [Video.js](https://videojs.com) with additional Mux specific functionality built in.
This includes support for:

* Enabling [timeline hover previews](/docs/guides/create-timeline-hover-previews)
* [Mux Data integration](/docs/guides/monitor-video-js)
* `playback_id` helper (we'll figure out the full playback URL for you)

## 2. Integrate video.js kit

Video.js kit is hosted on npm. To install it, navigate to your project and run:

```text
# npm
npm install @mux/videojs-kit

# yarn
yarn add @mux/videojs-kit
```

Now import the JavaScript and CSS in your application like this:

```js
// include the video.js kit JavaScript and CSS
import videojs from '@mux/videojs-kit';
import '@mux/videojs-kit/dist/index.css';
```

If you're not using a package manager such as npm, there are hosted versions provided by [jsdelivr.com](https://www.jsdelivr.com/) available too.
Use the hosted versions by including this in your HTML page:

```html
<!-- script -->
<script src="https://cdn.jsdelivr.net/npm/@mux/videojs-kit@latest/dist/index.js"></script>
<!-- CSS -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@mux/videojs-kit@latest/dist/index.css">
```

Then, on your page include a `<video>` element where you want to add your player.

```html
<video 
  id="my-player" 
  class="video-js vjs-16-9" 
  controls 
  preload="auto" 
  width="100%"
  data-setup='{}'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Replace the \{PLAYBACK\_ID} with the `playback_id` of your video from Mux.

## Integrate using Video.js's default playback engine

Video.js kit by default uses [hls.js](https://github.com/video-dev/hls.js).
As of version 0.8.0, you can now integrate with [Video.js's default playback engine](https://github.com/videojs/http-streaming).

To do so, you can follow the steps above but swap out the specific import file to be `index.vhs.js`.

For import:

```js
// include the video.js kit JavaScript and CSS
import videojs from '@mux/videojs-kit/dist/index.vhs.js';
```

For a script tag:

```html
<script src="https://unpkg.com/@mux/videojs-kit@latest/dist/index.vhs.js"></script>
```

## Source Code

Video.js kit is open source and can be found on GitHub here: [https://github.com/muxinc/videojs-mux-kit](https://github.com/muxinc/videojs-mux-kit)

## 3. Set configuration options

There are some built in additional features which can be set when you initialize video.js kit.

## Include a timeline hover preview

You can enable a timeline hover preview by including `timelineHoverPreviews: true` in the configuration options when you create your player.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "timelineHoverPreviews": true
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

## Enable Mux Data

You can enable Mux Data by including the following options when you create your player.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "plugins": {
      "mux": {
        "debug": true,
        "data":{
          "env_key": "ENV_KEY",
          "video_title": "Example Title"
        }
      }
    }
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

The `videojs-mux` data plugin is included by default, so you don't need to include this in addition to video.js kit. To link your data integration to your account,
you should replace the `ENV_KEY` in the configuration with an appropriate Mux environment key, as well as [set metadata](/docs/guides/monitor-video-js#3-make-your-data-actionable).

## Set the video source

You can set the video source using the `playback_id` from your video in Mux, and we'll figure out the fully formed playback URL automatically.

When you set the source for video.js, make sure you set the `type` as `video/mux` and the `src` as your `playback_id`.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>

```

## Initialize with JavaScript

The options above can also be initialized with JavaScript. Here's an example showing how you could enable all options with JavaScript.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  timelineHoverPreviews: true,
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: "{PLAYBACK_ID}",
  type: "video/mux",
});
</script>
```

## 4. Using signed URLs

Playback of Mux videos with a signed playback policy is supported from v0.3.0 onward.

Before continuing, ensure you have followed the [secure video playback](/docs/guides/secure-video-playback) guide, and are comfortable generating JSON Web Tokens (JWTs) to use with Mux.

## Play back a video with a signed URL

Use the `playback_id` from your video in Mux and then append your video JWT token
like this `{PLAYBACK_ID}?token={YOUR_VIDEO_JWT}` as your player source. When you set the source for video.js, make sure you set the `type` as `video/mux`.

In HTML:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%" data-setup="{}">
  <source src="{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}" type="video/mux" />
</video>

```

Or, achieve the same result using JavaScript:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: `{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}`,
  type: `video/mux`,
});

</script>
```

## Enable a timeline hover preview from a signed URL

Mux requires a separate JWT token to access the timeline hover preview storyboard URL. Unlike with public playback URLs, Video.js kit can't work this URL out automatically, so you need to pass the fully formed URL to the player.

To set up timeline hover previews with a signed URL, first make sure that `timelineHoverPreviews` is set to `false` or not set at all, which prevents the automatic URL generation from taking place.
Then, either set the `timelineHoverPreviewsUrl` in the player configuration like this:

```html
<video id="my-player" 
  class="video-js vjs-16-9" 
  controls 
  preload="auto" 
  width="100%" 
  data-setup='{
    "timelineHoverPreviewsUrl": "https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?token={JWT_STORYBOARD_TOKEN}"
  }'>
  <source src="{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}" type="video/mux" />
</video>

```

Or, achieve the same result using JavaScript and use the `player.timelineHoverPreviews()` function:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: `{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}`,
  type: `video/mux`,
});

player.timelineHoverPreviews({
  enabled: true, 
  src: "https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?token={JWT_STORYBOARD_TOKEN}"
});

</script>
```

`player.timelineHoverPreviews` is a function that can be used to set, update or remove timeline hover previews from a player, and takes a single object parameter.

The object has two properties: `src`, a string pointing to the VTT file that contains the timeline hover preview information, and `enabled`, which can be either `true`, to have the player use the provided source and set up timeline hover previews, or `false`, to disable any timeline hover previews currently configured on the player.

### Remove timeline hover previews

To switch off timeline hover previews, use the following API:

```js

player.timelineHoverPreviews({
  enabled: false, 
});
```

## 5. Enable a Quality Selector

As of [version v0.10.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.10.0), Video.js kit comes bundled with the [`videojs-contrib-quality-levels`][videojs-contrib-quality-levels] and [`videojs-http-source-selector`][videojs-http-source-selector] plugins. They are not enabled by default.

To enable them, you can pass it in as part of the plugins object:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "plugins": {
      "mux": {
        "debug": true,
        "data":{
          "env_key": "ENV_KEY",
          "video_title": "Example Title"
        }
      },
      "httpSourceSelector": {}
    }
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Or, call the `httpSourceSelector` function manually on the player:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  timelineHoverPreviews: true,
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: "{PLAYBACK_ID}",
  type: "video/mux",
});

// enable the quality selector plugin
player.httpSourceSelector();
</script>
```

## 6. Configure webpack for other plugins

If another Video.js plugin isn't loading when your page runs, you may need to configure Webpack, or another bundler, specially.
This is due to how Video.js and Video.js kit are built. The default build of Video.js kit doesn't use the default Video.js build, but rather Video.js's `core.js` build. This means Video.js plugins need to be configured to use the same build.
This can be done with [Webpack's `resolve.alias` configuration](https://webpack.js.org/configuration/resolve/#resolvealias):

```js
config.resolve = {
  alias: {
    'video.js': 'video.js/core',
  }
};
```
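For context, that snippet slots into a `webpack.config.js` like this. This is a minimal sketch assuming Webpack 5 defaults; the entry and output paths are placeholders:

```javascript
// webpack.config.js — minimal sketch; entry/output paths are placeholders
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js',
  },
  resolve: {
    alias: {
      // point plugins that import 'video.js' at the core build used by Video.js kit
      'video.js': 'video.js/core',
    },
  },
};
```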

## 7. Release notes

### Current release: v0.11.0

[View v0.11.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.11.0)

This release enables configuring hls.js via Video.js options:

```js
videojs('mux-default', {
  html5: {
    hls: {
      capLevelToPlayerSize: true
    }
  }
});
```

For advanced use cases, the `hlsjs` instance is exposed on the tech.

### Previous releases

#### v0.10.0

[View v0.10.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.10.0)

* This release includes [`videojs-contrib-quality-levels`][videojs-contrib-quality-levels] and [`videojs-http-source-selector`][videojs-http-source-selector] by default.

#### v0.9.3

[View v0.9.3](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.3)

* Update Video.js version to v7.18.1.

#### v0.9.2

[View v0.9.2](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.2)

* Update hls.js to [v1.1.5](https://github.com/video-dev/hls.js/releases/tag/v1.1.5)

#### v0.9.1

[View v0.9.1](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.1)

* As part of 0.7.0, tighter error handling integration with hls.js made all errors be triggered on the player. This meant that errors that don't inhibit playback and that hls.js handled automatically were treated the same as fatal errors that hls.js doesn't handle automatically. Now, only errors that hls.js considers fatal will trigger an error event.

#### v0.9.0

[View v0.9.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.0)

* Integrate with [`videojs-contrib-quality-levels`](https://github.com/videojs/videojs-contrib-quality-levels).

#### v0.8.0

[View v0.8.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.8.0)

* Introduce VHS build file

Check the [GitHub releases](https://github.com/muxinc/videojs-mux-kit/releases) page for the full version history.

[videojs-http-source-selector]: https://github.com/jfujita/videojs-http-source-selector

[videojs-contrib-quality-levels]: https://github.com/videojs/videojs-contrib-quality-levels


# Modify playback behavior
Use playback modifiers in your HLS urls to change the default playback behavior.
Playback modifiers are optional parameters added to the video playback URL. These modifiers allow you to change the behavior of the stream you receive from Mux.

Mux Video supports 2 different types of playback policies: `public` and `signed`. Playback modifiers are supported for both types of playback policies. However, the method to add them differs.

# Query String with `public` playback URL

```text
https://stream.mux.com/{PLAYBACK_ID}.m3u8?{MODIFIER_NAME}={MODIFIER_VALUE}
```

Replace `PLAYBACK_ID` with your asset's public policy playback ID. Replace `MODIFIER_NAME` and `MODIFIER_VALUE` with any of the supported modifiers listed below in this document.
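As a sketch, a small helper can append modifiers to a public playback URL (`playbackUrl` is a hypothetical name, not part of any Mux SDK):

```javascript
// Hypothetical helper: append playback modifiers to a public playback URL.
function playbackUrl(playbackId, modifiers = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  for (const [name, value] of Object.entries(modifiers)) {
    url.searchParams.set(name, value);
  }
  return url.toString();
}

// e.g. cap renditions at 720p:
// playbackUrl('abc123', { max_resolution: '720p' })
//   → 'https://stream.mux.com/abc123.m3u8?max_resolution=720p'
```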

# JWT Claim with `signed` playback URL

```text
https://stream.mux.com/{PLAYBACK_ID}.m3u8?token={TOKEN}
```

Replace `PLAYBACK_ID` with your asset's signed policy playback ID and `TOKEN` with the signature generated. Add modifiers to your claims body in the JWT payload. View the guide for [Secure video playback](/docs/guides/secure-video-playback#note-on-query-parameters-after-signing) for details about adding query parameters to signed tokens.
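As a sketch, the decoded payload of a signed token that includes a `max_resolution` modifier might look like this (the `sub`, `aud`, and `exp` claims are covered in the [Secure video playback](/docs/guides/secure-video-playback) guide; the `exp` value here is a placeholder epoch timestamp):

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "v",
  "exp": 1712880000,
  "max_resolution": "720p"
}
```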

# Available playback modifiers

| Modifier | Available Values | Default Value | Description |
| :-- | :-- | :-- | :-- |
| `redundant_streams` | `true`, `false` | `false` | Includes redundant streams in the HLS manifest. See the [Play your videos](/docs/guides/play-your-videos#add-delivery-redundancy-with-redundant-streams) guide. |
| `roku_trick_play` | `true`, `false` | `false` | Adds support for timeline hover previews on Roku devices. See the [Create timeline hover previews](/docs/guides/create-timeline-hover-previews#roku-trick-play) guide. |
| `default_subtitles_lang` | A BCP 47 compliant language code | none | Sets which subtitles/captions language should be the default. See the [Add subtitles](/docs/guides/add-subtitles-to-your-videos#showing-subtitles-by-default) guide. |
| `max_resolution` | `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, `2160p` | none | Sets the maximum resolution of renditions included in the manifest. See the [Control playback resolution](/docs/guides/control-playback-resolution#specify-maximum-resolution) guide. |
| `min_resolution` | `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, `2160p` | none | Sets the minimum resolution of renditions included in the manifest. See the [Control playback resolution](/docs/guides/control-playback-resolution#specify-minimum-resolution) guide. |
| `rendition_order` | `desc` | Automatically ordered by Mux's internal logic | Sets how renditions are ordered in the HLS manifest. See [the blog post](https://www.mux.com/blog/more-tools-to-control-playback-behavior-min-resolution-and-rendition-order#rendition_order). |
| `program_start_time` | An epoch timestamp | none | Sets the start time of a live stream, or of an asset created from one, when using the [instant clipping feature](/docs/guides/create-instant-clips). |
| `program_end_time` | An epoch timestamp | none | Sets the end time of a live stream, or of an asset created from one, when using the [instant clipping feature](/docs/guides/create-instant-clips). |
| `asset_start_time` | Time (in seconds) | none | Sets the relative start time of the asset when using the [instant clipping feature](/docs/guides/create-instant-clips). |
| `asset_end_time` | Time (in seconds) | none | Sets the relative end time of the asset when using the [instant clipping feature](/docs/guides/create-instant-clips). |


# Add metadata to your videos
Learn how to add titles and other metadata to your videos for better organization, discoverability, and actionable analytics.
## What is asset metadata?

Metadata provides additional descriptive information about your video assets. Mux currently supports three key optional metadata fields that help you organize and manage your video content across the API and dashboard:

* `title`: A descriptive name for your video content. We limit this to 512 code points.
* `creator_id`: A value you set to identify the creator or owner of the video. We limit this to 128 code points.
* `external_id`: Another value you set to reference this asset in your system, such as the video ID in your database. We limit this to 128 code points.

<Callout id="code-points" type="info">
  **What is a code point?** Many of us say "characters" when talking about the letters in a string, but when those characters are stored, some take more units of storage than others. Those units are called "code points". <br /><br />While each ASCII character can be stored with a single code point, some Unicode characters, such as a decomposed `é`, are stored as two code points: one for the `e`, and one for the combining accent ` ́`. You can easily test this in JavaScript. JavaScript's `.length` property counts code points, not characters, so a decomposed `"é".length` will be `2`.
</Callout>
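You can see the distinction described in the callout directly in JavaScript:

```javascript
// One precomposed code point vs. two code points for the same visible character.
const precomposed = '\u00e9';   // "é" as a single precomposed code point
const decomposed = 'e\u0301';   // "e" followed by a combining acute accent

console.log(precomposed.length); // 1
console.log(decomposed.length);  // 2 — two code points, one visible character
```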

Here's an example of what a `meta` object might look like:

```json
{
   "title": "Guide: Adding metadata to videos",
   "creator_id": "user_23456",
   "external_id": "cdef2345"
}
```

<Callout type="info">
  **Note:** Do not include personally identifiable information in these fields. They will be accessible by browsers to display player UI.
</Callout>

Once set on an asset, you'll find this metadata on assets across the Mux API and dashboard, including asset management, [engagement](/beta/engagement) and [data](/data).

## Manage metadata through the Dashboard

We've deeply integrated asset metadata throughout the Mux dashboard:

<Player playbackId="trRCuyNyUHeYdQ5ZbvSsRf34Reuc301CDQzAxDUqog1w" thumbnailTime="10" title="Asset metadata demo" className="flex" />

* When uploading, we use your filename as the title - but you can change it at any time
* For live streams, you can set the default metadata for recordings on the stream details page
* When viewers watch your content, all metadata flows into Mux Data and the engagement dashboard - making it easy to find videos by title, or filter by creator id.

## Manage metadata through the API

## Create an asset with metadata

When creating an asset you can include your metadata in the body of the request.

#### Example request

```json
// POST /video/v1/assets
{
    "inputs": [
        {
            "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
        }
    ],
    "playback_policies": [
        "public"
    ],
    "video_quality": "basic",
    "meta": {
        "title": "Mux demo video",
        "creator_id": "abcd1234",
        "external_id": "bcde2345"
    }
}
```
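Putting that together in code, here's a sketch of sending the request from Node 18+ with its global `fetch`. The helper names are illustrative; authentication uses your API token ID and secret as HTTP Basic credentials:

```javascript
// Sketch: create an asset with metadata via the Mux API (Node 18+).
// Assumes MUX_TOKEN_ID / MUX_TOKEN_SECRET env vars hold API token credentials.
function createAssetBody(meta) {
  // Request body matching the example above.
  return {
    inputs: [{ url: 'https://storage.googleapis.com/muxdemofiles/mux.mp4' }],
    playback_policies: ['public'],
    video_quality: 'basic',
    meta,
  };
}

async function createAsset(meta) {
  const auth = Buffer.from(
    `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
  ).toString('base64');

  const res = await fetch('https://api.mux.com/video/v1/assets', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify(createAssetBody(meta)),
  });
  if (!res.ok) throw new Error(`Mux API returned ${res.status}`);
  return res.json();
}
```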

#### Need more help?

* Check out our [getting started guide](/docs/core/stream-video-files) for a more thorough introduction to creating assets.
* Check out our <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create an asset</ApiRefLink> for a list of all possible parameters.

## Update the metadata on an asset

Once an asset has been created the metadata can be changed at any time. Make a request to update the asset and include your metadata in the request body.

#### Example request

```json
// PATCH /video/v1/assets/{ASSET_ID}
{
    "meta": {
        "title": "Updated Mux demo video",
        "creator_id": "cdef3456",
        "external_id": "defg4567"
    }
}
```

#### Need more help?

* Check out our <ApiRefLink href="/docs/api-reference/video/assets/update-asset">Update asset API reference</ApiRefLink> for more details.

## Directly upload a video with metadata

Direct uploads are a [multi-step process](/docs/guides/upload-files-directly), and metadata should be attached in the very first step. When creating your authenticated URL in that first step you can include your metadata alongside the rest of the asset settings in `new_asset_settings`.

#### Example Request

```json
// POST /video/v1/uploads
{
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ],
        "video_quality": "basic",
        "meta": {
            "title": "Mux demo video",
            "creator_id": "abcd1234",
            "external_id": "bcde2345"
        }
    },
    "cors_origin": "*"
}
```

#### Need more help?

* Check out our [direct upload guide](/docs/guides/upload-files-directly) for details on every step.

## Set live stream metadata defaults for creating assets

Mux automatically creates a new asset each time you connect to a live stream. When creating or updating your live stream you can include metadata that gets automatically set on the generated assets in the request body, under `new_asset_settings`.

#### Example "Create Live Stream" request

```json
// POST /video/v1/live-streams
{
    "playback_policies": [
        "public"
    ],
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ],
        // The values set here will be applied to all assets created from this stream.
        "meta": {
            "title": "Mux demo live stream recording",
            "creator_id": "abcd1234",
            "external_id": "bcde2345"
        }
    },
    // You can also add a title here, but it won't affect the assets. This is just for your reference in the API!
    "meta": {
        "title": "Demo stream"
    }
}
```

#### Need more help?

* Check out our "[start live streaming](/docs/guides/start-live-streaming)" guide for a deeper walkthrough.


# Add auto-generated captions to your videos and use transcripts
Learn how to add auto-generated captions to your on-demand Mux Video assets, to increase accessibility and to create transcripts for further processing.
## How auto-generated captions work

Mux uses [OpenAI's Whisper model](https://openai.com/index/whisper) to automatically generate captions for on-demand assets. This guide shows you how to enable this feature, what you can do with it, and what some of the limitations you might encounter are.

Generally, you should expect auto-generated captions to work well for content with reasonably clear audio. It may work less well with assets that contain a lot of non-speech audio (music, background noise, extended periods of silence).

We recommend that you try it out on some of your typical content, and see if the results meet your expectations.

This feature is designed to generate captions in the same language as your content's audio. It should not be used to programmatically generate translated captions in other languages.

## Enable auto-generated captions

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create a Mux Asset</ApiRefLink>, you can add a `generated_subtitles` array to the API call, as follows:

```json
// POST /video/v1/assets
{
    "inputs": [
        {
            "url": "...",
            "generated_subtitles": [
                {
                    "language_code": "en",
                    "name": "English CC"
                }
            ]
        }
    ],
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
}
```

Mux supports the following languages and corresponding language codes for VOD generated captions. Languages labeled as "beta" may have lower accuracy.

| Language | Language Code | Status |
| :-- | :-- | :-- |
| English | en | Stable |
| Spanish | es | Stable |
| Italian | it | Stable |
| Portuguese | pt | Stable |
| German | de | Stable |
| French | fr | Stable |
| Automatic Detection | auto | Stable |
| Polish | pl | Beta |
| Russian | ru | Beta |
| Dutch | nl | Beta |
| Catalan | ca | Beta |
| Turkish | tr | Beta |
| Swedish | sv | Beta |
| Ukrainian | uk | Beta |
| Norwegian | no | Beta |
| Finnish | fi | Beta |
| Slovak | sk | Beta |
| Greek | el | Beta |
| Czech | cs | Beta |
| Croatian | hr | Beta |
| Danish | da | Beta |
| Romanian | ro | Beta |
| Bulgarian | bg | Beta |

You can also enable autogenerated captions if you're <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">using Direct Uploads</ApiRefLink> by specifying the `generated_subtitles` configuration in the first entry of the `input` list of the `new_asset_settings` object, like this:

```json
// POST /video/v1/uploads
{
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ],
        "video_quality": "basic",
        "inputs": [
            {
                "generated_subtitles": [
                    {
                        "language_code": "en",
                        "name": "English CC"
                    }
                ]
            }
        ]
    },
    "cors_origin": "*"
}
```

Auto-captioning happens separately from the initial asset ingest, so that this doesn't delay the asset being available for playback. If you want to know when the text track for the captions is ready, listen for the `video.asset.track.ready` webhook for a track with `"text_source": "generated_vod"`.
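For example, a webhook handler might filter for this event like so. This is a sketch: `isGeneratedCaptionsReady` is a hypothetical helper, and you should verify the exact payload shape against your own webhook deliveries:

```javascript
// Sketch: decide whether a webhook event signals that an auto-generated
// captions track is ready. Field names follow the webhook described above.
function isGeneratedCaptionsReady(event) {
  return (
    event.type === 'video.asset.track.ready' &&
    Boolean(event.data) &&
    event.data.text_source === 'generated_vod'
  );
}

// In an Express handler you might use it like:
// app.post('/mux-webhooks', express.json(), (req, res) => {
//   if (isGeneratedCaptionsReady(req.body)) { /* captions are ready */ }
//   res.sendStatus(200);
// });
```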

### Retroactively enable auto-generated captions

You can retroactively add captions to any asset by POSTing to the `generate-subtitles` endpoint on the asset audio track that you want to generate captions for, as shown below:

```json
// POST /video/v1/assets/${ASSET_ID}/tracks/${AUDIO_TRACK_ID}/generate-subtitles

{
  "generated_subtitles": [
    {
      "language_code": "en",
      "name": "English (generated)"
    }
  ]
}
```

**For self-service customers:** You can add captions to any asset using this API.

**For contract customers:** If you need to add captions to assets older than 7 days, [please contact support](/support/human) and we'd be happy to help. Please note that there may be a charge for backfilling captions onto large libraries.

## Retrieve a transcript

For assets that have a `ready` auto-generated captions track, you can also request a transcript (a plain text file) of the speech recognized in your asset.

To get this, use a playback id for your asset and the track id for the `generated_vod` text track:

<Callout>
  If you don't know the `TRACK_ID`, you can retrieve it by listing the asset's tracks using the{' '}
  <ApiRefLink href="/docs/api-reference/video/assets">Asset endpoint</ApiRefLink> under `tracks` and the corresponding `track.id`.
</Callout>

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.txt
```

Signed assets require a `token` parameter specifying a JWT with the same `aud` claim used for [video playback](/docs/guides/secure-video-playback#4-generate-a-json-web-token-jwt):

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.txt?token={JWT}
```

You can also retrieve a WebVTT version of the text track by replacing `.txt` with `.vtt` in the URL.
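As a sketch, building and fetching the transcript URL might look like this (`transcriptUrl` is a hypothetical helper, not part of any Mux SDK):

```javascript
// Hypothetical helper: build the transcript URL for a playback ID and track ID.
function transcriptUrl(playbackId, trackId, token) {
  const url = new URL(`https://stream.mux.com/${playbackId}/text/${trackId}.txt`);
  if (token) url.searchParams.set('token', token); // only needed for signed playback IDs
  return url.toString();
}

// With Node 18+ global fetch:
// const transcript = await fetch(transcriptUrl(playbackId, trackId)).then((r) => r.text());
```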

You might find this transcript useful for further processing in other systems: for example, content moderation, sentiment analysis, summarization, or extracting insights from your content.

## FAQ

### How much does auto-generated captioning cost for on-demand assets?

There is no additional charge for this feature. It's included as part of the standard encoding and storage charges for Mux Video assets.

### How long does it take to generate captions?

It depends on the length of the asset, but generally it takes about 0.1x content duration. As an example, a 1 hour asset would take about 6 minutes to generate captions for.

### Help, the captions you generated are full of mistakes!

We're sorry to hear that! Unfortunately, though automatic speech recognition has improved enormously in recent years, sometimes it can still get things wrong.

One option you have is to edit and replace the mis-recognized speech in the captions track:

1. Download the full VTT file we generated at `https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt`
2. Edit the VTT file using your preferred text editor
3. Delete the autogenerated track with the <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-track">'delete track' API</ApiRefLink>
4. Add a new track to your asset using the edited VTT file, using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">`create track` API</ApiRefLink>

### My content is in multiple languages

We currently do not recommend using this feature on mixed-language content.

### I want to generate captions in a different language to my content

We currently do not support automatic translation in generated captions - you should only generate captions in the language that matches your audio track.

### My content is in a language you don't support

We'd love to hear more about the languages that you'd like to see us support, please [reach out](/support) with details.


# Add subtitles/captions to videos
Learn how to add subtitles or captions to your videos for accessibility and multi-language support.
## Introduction to subtitles and captions

Subtitles and captions allow for text overlays on a video to be shown at a specified time. First, let's clarify these two terms which are often used interchangeably.

* **Subtitles** refers to text on screen for translation purposes.
* **Captions** refers to text on screen for use by deaf and hard of hearing audiences. If you see text like `[crowd cheers]`, you are seeing *captions* on your screen.

In any case, Mux supports both, in the form of [WebVTT](https://www.w3.org/TR/webvtt1/) or [SRT](https://en.wikipedia.org/wiki/SubRip) files, which can be human- or computer-generated. From Mux's perspective these files become "text tracks" associated with the asset. If the text track you provide contains *captions*, supply the attribute `closed_captions: true` when creating the text track.

The rest of this guide will use the term "subtitles" to refer to adding text tracks that can be either subtitles or captions.

## How to add subtitles to your video

You can add subtitles to any video asset in Mux. To add subtitles, you will need to provide either a `SRT` or `WebVTT` file containing the subtitle information to the Mux API.

Here's an example of what a WebVTT file looks like:

```text
WEBVTT

00:28.000 --> 00:30.000 position:90% align:right size:35%
...you have your robotics, and I
just want to be awesome in space.

00:31.000 --> 00:33.000 position:90% align:right size:35%
Why don't you just admit that
you're freaked out by my robot hand?
```

<Callout type="info">
  Mux can also [automatically generate your captions](/docs/guides/add-autogenerated-captions-and-use-transcripts)
</Callout>

## Create a subtitle track

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create an asset</ApiRefLink> in Mux, you can also include text tracks as part of the input. There's no limit on the number of tracks you can include when you make the request.

The first input in your array of inputs must be the video file. After that, the caption tracks should be appended to the list, each including the source URL to the caption track, plus additional metadata. Here's an example of the order to use here:

```json
{
    "inputs": [
      {
        "url": "{VIDEO_INPUT_URL}"
      },
      {
        "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-en.vtt",
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": false,
        "language_code": "en",
        "name": "English"
      },
      {
        "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-fr.vtt",
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": false,
        "language_code": "fr",
        "name": "Français"
      }
    ],
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
}
```

This will enable WebVTT subtitles in the stream URL, which can then be used by many different players.

You can also add text tracks using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">create asset track API</ApiRefLink>. This is helpful for adding captions to live stream recordings once they have finished, or for adding, updating, or removing languages after a video was first added to Mux. Assets must be in the `ready` state before you can use the create asset track API to add a text track.
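As a sketch, the request body for the create asset track endpoint can be assembled like this (Python for illustration; the URL below is a placeholder, and you would still need to POST the body with your own HTTP client and Mux API credentials):

```python
import json

def subtitle_track_payload(url, language_code, name=None, closed_captions=False):
    """Build the JSON body for a create asset track request (text track)."""
    payload = {
        "url": url,
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": closed_captions,
        "language_code": language_code,
    }
    if name is not None:
        payload["name"] = name
    return payload

# Example: an English captions track (note closed_captions=True for captions).
body = json.dumps(subtitle_track_payload(
    "https://example.com/captions-en.vtt", "en",
    name="English", closed_captions=True))
```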

## Showing subtitles by default

To show subtitles by default, you can include an additional playback modifier with the HLS stream request like this:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?default_subtitles_lang=en
```

The `default_subtitles_lang` playback modifier requires a valid [BCP-47](https://tools.ietf.org/rfc/bcp/bcp47.txt) language value to set the DEFAULT attribute value to YES for that language.
If there's no exact language match, the closest match of the same language is selected.

For instance, a subtitles text track with language `en-US` is selected for `default_subtitles_lang=en`. This accommodates regional variations and gives you more flexibility.

Video players play the default text track for autoplaying videos even when muted.
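The fallback behavior can be illustrated with a small sketch. This is an illustration of the matching rule described above, not Mux's actual implementation:

```python
def pick_default_subtitles(requested, available):
    """Prefer an exact BCP-47 match, then fall back to any track that
    shares the same primary language subtag as the request."""
    if requested in available:
        return requested
    primary = requested.split("-")[0].lower()
    for lang in available:
        if lang.split("-")[0].lower() == primary:
            return lang
    return None

# A request for "en" selects the "en-US" track when no exact match exists.
selected = pick_default_subtitles("en", ["fr", "en-US"])
```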

<Callout type="warning" title="Using `default_subtitles_lang` with signed URLs">
  If you are using [signed playback URLs](/docs/guides/secure-video-playback) make sure you include the extra parameter in your signed token.
</Callout>

## Accessibility

The [A11Y project](https://www.a11yproject.com/) is a community-driven effort to make digital accessibility easier and includes checking videos for accessibility.

With Mux videos, the `jsx-a11y/media-has-caption` rule fails because it looks for a `<track>` element inside the player. However, Mux delivers subtitles in the HLS manifest when you request the stream.
If you have added text tracks to your Mux videos, you can safely disable this linting rule and still provide accessible video.

## Workflow for generating subtitles

You may want to generate subtitle tracks for your Mux assets. These might be machine-generated, or created by you or a third party. Some example third-party services you might use are [Rev.com](https://www.rev.com/) and [Simon Says](https://www.simonsays.ai/).

<Callout type="info">
  Mux can also [automatically generate your captions](/docs/guides/add-autogenerated-captions-and-use-transcripts)
</Callout>

Using static renditions and webhooks from Mux, your automated flow might look like this:

1. Create a Mux asset (either with a Direct Upload, an `input` parameter, or the recording of a live stream).
2. Enable `mp4_support` on your asset, either at creation time or after the asset has been created. See the [Download your videos guide](/docs/guides/download-your-videos) for details.
3. Wait for the `video.asset.static_renditions.ready` webhook. This lets you know that the MP4 rendition(s) are now available.
4. Send a request to the third party you are using to create subtitles, passing along the MP4 file. You should get back either an SRT or WebVTT file when the subtitle track is ready.
5. When the subtitle track is ready, make an API request to add the text track to your asset, as described above.
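The steps above can be sketched as a small webhook dispatcher. The event field paths and return values here are illustrative assumptions, not a prescribed structure:

```python
def handle_mux_webhook(event):
    """Route the Mux webhook events used in the subtitle-generation flow."""
    if event["type"] == "video.asset.static_renditions.ready":
        # MP4 renditions now exist; hand one to your transcription vendor.
        return ("request_transcription", event["data"]["id"])
    if event["type"] == "video.asset.track.ready":
        # A new track (e.g. the subtitle track you added) is ready.
        return ("track_ready", event["data"]["id"])
    # Any other event type is irrelevant to this workflow.
    return ("ignored", None)
```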


# Add alternate audio tracks to videos
Learn how to use multi-track audio to add alternate audio tracks to your videos
## Introduction to multi-track audio

The multi-track audio feature allows you to add alternate audio tracks to the video assets in your Mux account.

Videos with multi-track audio can be used for increased accessibility or multi-language support, or just to allow viewers to opt into a different audio experience, like a director's commentary.

## (Optional) Set the language and name for your primary audio track

<Callout type="info">
  Optional but highly recommended to increase accessibility if you're delivering alternate audio tracks.
</Callout>

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create an asset</ApiRefLink> in Mux, you can also specify the `language_code` and `name` of the primary audio track that's embedded in your first input file.

```json
// POST https://api.mux.com/video/v1/assets

{
  "inputs": [
    {
      "url": "{VIDEO_INPUT_URL}",
      "language_code" : "en",
      "name" : "English"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "video_quality": "basic"
}
```

A `name` is optional but highly recommended. If you don't specify it, we'll generate it for you based on the `language_code` you provided. The `language_code` must be a [BCP-47 language tag](https://en.wikipedia.org/wiki/IETF_language_tag), such as `en` for English, or `es` for Spanish. You can find a list of common [BCP-47 language tags here](https://en.wikipedia.org/wiki/IETF_language_tag#List_of_common_primary_language_subtags).

You can still use multi-track audio with assets that don't have a language or name set on your initial upload; we'll just call your primary audio track "Default," with no language.

## Add alternate audio tracks to your asset

Once you've created your asset with a primary audio track, you can add alternate audio tracks using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">create asset track API</ApiRefLink>, specifying the URL of the audio file you wish to add, and the `language_code` of the alternate audio track. This is the same API that you can use to add captions to your assets.

Mux supports most audio file formats and codecs, such as M4A, WAV, or MP3 files, but for fastest processing you should [use standard inputs wherever possible](/docs/guides/minimize-processing-time).

```json
// POST https://api.mux.com/video/v1/assets/${ASSET_ID}/tracks

{
  "url": "https://example.com/bar.m4a",
  "type": "audio",
  "language_code": "fr",
  "name": "Français"
}
```

Assets must be in the `ready` state before you can use the create asset track API to add the alternate audio track.

You always need to specify the `language_code` for an alternate audio track, but the `name` is optional. If you don't specify a `name`, we'll generate it for you based on the language code you provided.

You will need to call the API once for each alternate audio track that you want to add.
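As a sketch, the per-track request bodies might be assembled in a loop like this (the track list and URLs are hypothetical; each payload would be one POST to the create asset track endpoint):

```python
# Hypothetical alternate audio tracks to attach to an asset.
alternate_audio = [
    {"url": "https://example.com/audio-fr.m4a", "language_code": "fr", "name": "Français"},
    {"url": "https://example.com/audio-es.m4a", "language_code": "es"},  # name auto-generated by Mux
]

def audio_track_payload(track):
    """Build the body for one create asset track call (audio track).
    `name` is optional; Mux derives one from language_code when omitted."""
    body = {
        "url": track["url"],
        "type": "audio",
        "language_code": track["language_code"],
    }
    if "name" in track:
        body["name"] = track["name"]
    return body

payloads = [audio_track_payload(t) for t in alternate_audio]
```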

## Play your videos with multi-track audio

When the alternate audio track has been processed, Mux will automatically add it to the HLS playback URL for your asset.

Many video players already support multi-track audio right out of the box, including [Mux Player](/docs/guides/mux-player-web), Video.js, ExoPlayer, and AVPlayer. So just drop your usual playback URL into your favorite video player, and click play. If your player doesn't support multi-track audio, you'll just hear the primary audio track.

Switching between audio tracks differs by video player, but it's usually done from a menu in the bottom right of the player controls. In the Mux Player example below, click the waveform icon.

<Player playbackId="3x5wDUHxkd8NkEfspLUK3OpSQEJe3pom" thumbnailTime="0" title="Multi-track audio demo" />


# Introduction to video clipping
Overview of Mux's video clipping features, comparing instant clipping and asset-based clipping, with guidance on when to use each.
## Clipping video with Mux

Mux provides two approaches for creating clips from your video content: **Instant Clipping** and **Asset-Based Clipping**. Each method is designed for different use cases, offering flexibility depending on your needs for speed, accuracy, and workflow.

### Instant Clipping

[Instant Clipping](/docs/guides/create-instant-clips) allows you to create clips instantly by specifying start and end times directly in the playback URL (using query parameters or JWT claims). This approach does not require re-encoding or creating a new asset, so clips are available immediately and at no extra encoding cost. Instant clipping operates at the segment level, so clips may start or end a few seconds outside your requested range, depending on the stream's segment duration.

#### Key features:

* No additional encoding or asset creation required
* Clips are available instantly
* No extra cost for encoding or storage
* Segment-level accuracy (not frame-accurate)
* Ideal for live streams and quick highlight creation

**Learn more:** [Create instant clips](/docs/guides/create-instant-clips)

### Asset-Based Clipping

[Asset-Based Clipping](/docs/guides/create-clips-from-your-videos) creates a new, standalone asset from a portion of an existing video or live stream recording. This method involves a re-encoding process, resulting in a new asset with its own playback ID, and supports frame-accurate clipping. Asset-based clips incur encoding and storage costs and may take some time to process before they are ready for playback.

#### Key features:

* Creates a new asset with its own playback ID
* Frame-accurate clipping
* Supports additional features like watermarks and text tracks
* Incurs encoding and storage costs
* Suitable for polished, distributable clips or downloadable MP4s

**Learn more:** [Create asset-based clips](/docs/guides/create-clips-from-your-videos)

## Which clipping approach should you use?

Choose **Instant Clipping** if:

* You need clips to be available immediately
* You want to avoid extra encoding costs
* Segment-level accuracy is sufficient for your use case (e.g., live highlights, quick previews)
* You want to limit playback to a specific range without creating a new asset

Choose **Asset-Based Clipping** if:

* You require frame-accurate clips
* You need a new asset for distribution, download, or further processing
* You want to add watermarks or preserve text tracks in the clip
* You are willing to wait for processing and incur encoding/storage costs

| Feature | Instant Clipping | Asset-Based Clipping |
|---------|-----------------|---------------------|
| Availability | ✅ Immediate | ⏳ Requires processing |
| Frame Accuracy | ❌ Segment-level only | ✅ Frame-accurate |
| Additional Encoding Cost | ❌ No | ✅ Yes |
| Additional Storage Cost | ❌ No | ✅ Yes |
| Watermark Support | ❌ No | ✅ Yes |
| Text Track Support | ❌ No | ✅ Yes |
| Downloadable MP4s | ❌ No | ✅ Yes |
| Live Stream Support | ✅ Yes | ✅ Yes (recordings) |
| Unique Playback ID | ❌ No | ✅ Yes |

For a deeper dive into each approach, see the individual guides:

* [Create instant clips](/docs/guides/create-instant-clips)
* [Create asset-based clips](/docs/guides/create-clips-from-your-videos)


# Create instant clips
Learn how to create instant clips at no extra cost.
## Use cases for instant clipping

Instant clipping allows you to set the start and end times of the streaming URL to make clips that are instantly available, without the wait time or expense of creating a new asset. This feature can be used to build a variety of viewer-facing workflows.

<Callout type="info">
  If you require frame accurate clips, clipped masters, or clipped MP4s, you should use the [asset-based clipping feature](/docs/guides/create-clips-from-your-videos).
</Callout>

Here are examples of workflows that can be built with instant clipping:

### Pre-live workflows

Sometimes you need to connect your contribution encoder to a live stream and test that the video is working end-to-end before exposing the live stream to your audience. But when you have [DVR mode](/docs/guides/stream-recordings-of-live-streams) turned on for your stream, it's often necessary to prevent viewers from seeking back into the parts of the live stream where your announcers are saying "testing, testing, 1… 2… 3…".

Instant clipping can be used to specify a start time to allow playback of a live stream, stopping users from seeking back into the stream beyond where you want. You can also specify an end time if you're worried about extra content at the end of your live events.

### Post-live trimming without re-encoding

With our [asset-based clipping feature](/docs/guides/create-clips-from-your-videos) you can create clipped on-demand assets, which are shortened versions of a given asset (commonly called "top and tail editing"). These assets always incur an encoding cost to process the clipped version, and can take some time to process.

With instant clipping, for any asset generated from a live stream, you can simply specify the start and end times of the content you want clipped directly during playback without the need for time-consuming and costly re-processing.

For example, if you broadcast multiple sports events back-to-back on a single live stream, you can use instant clipping to generate instant on-demand streams of each match as it ends for no extra cost.

### Highlight clips

Sometimes a really exciting moment happens on a live stream, and you want to clip out a short highlight for others to enjoy. You can use instant clipping to pull out short clips from a currently active asset for promoting on your homepage or embedding into news articles.

This can be used for example to instantly show just the 90th-minute equalizer goal on your home page while having extra time and penalties to watch live on your pay-to-view platform.

## How instant clipping works

### From a live stream

Every live stream or asset generated from a live stream contains a timestamp that is close (usually within a second) to the time that Mux received the source video from the contribution encoder. This timestamp is known as ["Program Date Time"](https://www.mux.com/video-glossary/pdt-program-date-and-time) or "PDT" for short.

<Callout type="info">
  "PDT" has nothing to do with the Pacific Daylight time zone; all times are represented in UTC or with unix timestamps.
</Callout>

Instant clipping works by trimming the HLS manifests from live streams and VOD assets originating from live streams using these PDT timestamps, without re-encoding any segments. This means that instant clipping operates at the segment level of accuracy, so you should expect that the content that you clip out may be several seconds longer than you've requested. We always make sure to include the timestamps that you request, but your content may start a few seconds earlier, and end a few seconds later. The exact accuracy depends on the latency settings of the live stream that you're clipping from.

### From a VOD asset

Whether an asset originated from a live stream or was uploaded, you can create instant clips using relative time markers for the start and end to generate the trimmed HLS manifest. The relative time markers are measured from the beginning of the asset, so specifying a range of `10` to `20` results in a 10-second clip between `0:00:10` and `0:00:20`.

## Creating an instant clip URL

Instant clipping is controlled by passing [playback modifiers](/docs/guides/modify-playback-behavior) (query string arguments or JWT claims) to the playback URL of your live stream or VOD assets. If you're using signed URLs, these playback modifiers need to be embedded into your JWT.

### Live stream instant clips

While Mux timestamps video frames when they are received, there is a delay while enough frames are processed to form sufficient segments for a live stream to be started.

This means that you should expect some delay from wall-clock time to when you can use a given timestamp as a `program_start_time`.

For example, if a commentator presses a "Go Live" button at 13:00 UTC, which sets the `program_start_time` of a live stream to that timestamp, you should expect requests for the live stream's manifest to respond with an HTTP 412 error for up to 15 seconds afterwards (the exact delay depends on the `latency_mode` of your live stream).
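One way to handle this window is to retry on HTTP 412 with a short delay. A minimal sketch, assuming `fetch_status` wraps whatever HTTP client you use to request the manifest:

```python
import time

def wait_for_clip_manifest(fetch_status, attempts=15, delay=1.0, sleep=time.sleep):
    """Retry until the manifest stops returning HTTP 412 (clip start time
    not yet available). `fetch_status` is any callable returning a status code."""
    for _ in range(attempts):
        status = fetch_status()
        if status != 412:
            return status
        sleep(delay)
    raise TimeoutError("manifest still returning HTTP 412")

# Stubbed example: the first two requests return 412, then the manifest is ready.
responses = iter([412, 412, 200])
result = wait_for_clip_manifest(lambda: next(responses), sleep=lambda s: None)
```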

The start and end time of your trimmed live stream or on-demand asset are specified by using the following two parameters:

#### Using `program_start_time`

This parameter accepts an epoch time and can be set on a playback URL, and sets the start time of the content within the live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_start_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_start_time=1707740400
```

When used on a live stream, this will cause the live stream to behave as if it is idle prior to this time.

When used on an asset, this will trim the start of the streamed media to this timestamp if needed.

#### Using `program_end_time`

This parameter accepts an epoch time and can be set on a playback URL, and sets the end time of the content within the live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_end_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_end_time=1707740460
```

When used on a live stream, this will cause the live stream to behave as if it is idle after this time.

When used on an asset, this will trim the end of the streamed media to this timestamp.

#### Combining `program_start_time` and `program_end_time`

These parameters can be used together to extract a specific clip of a live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_start_time=${EPOCH_TIME}&program_end_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_start_time=1707740400&program_end_time=1707740460
```

### VOD instant clips

The start and end time of your trimmed on-demand asset are specified by using the following two parameters:

#### Using `asset_start_time`

This parameter accepts relative time and can be set on a Playback URL, and sets the start time of the content within the asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_start_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_start_time=10
```

#### Using `asset_end_time`

This parameter accepts relative time and can be set on a Playback URL, and sets the end time of the content within the asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_end_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_end_time=20
```

#### Combining `asset_start_time` and `asset_end_time`

You can also use both of these parameters to create an instant clip of specific portion of your asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_start_time=${RELATIVE_TIME}&asset_end_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_start_time=10&asset_end_time=20
```
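A small helper can keep the two parameter families straight; here's a sketch (the playback ID is the demo ID used above, and `live` is a flag you'd set from your own data model):

```python
from urllib.parse import urlencode

def instant_clip_url(playback_id, start=None, end=None, live=False):
    """Build an instant-clip HLS URL. Live streams use epoch-based
    program_* parameters; VOD assets use relative asset_* parameters."""
    prefix = "program" if live else "asset"
    params = {}
    if start is not None:
        params[f"{prefix}_start_time"] = start
    if end is not None:
        params[f"{prefix}_end_time"] = end
    url = f"https://stream.mux.com/{playback_id}.m3u8"
    return f"{url}?{urlencode(params)}" if params else url

# 10-second clip of a VOD asset, between 0:00:10 and 0:00:20.
vod_url = instant_clip_url("sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq", start=10, end=20)
```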

## Thumbnail & Storyboard support

### Images for VOD assets

To generate images for VOD assets, the `time` [query string parameter](/docs/guides/get-images-from-a-video#thumbnail-query-string-parameters) can be used to retrieve an image from the video, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/thumbnail.png?time=${RELATIVE_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/thumbnail.png?time=15
```

Storyboard generation for VOD assets supports these parameters as a way to generate storyboard tiles for frames between the `asset_start_time` and `asset_end_time` values, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/storyboard.png?asset_start_time=${RELATIVE_TIME}&asset_end_time=${RELATIVE_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.png?asset_start_time=10&asset_end_time=20
```

### Images for live streams

For thumbnails, you can now pass an absolute time using the `program_time` parameter, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/thumbnail.png?program_time=${EPOCH_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/thumbnail.png?program_time=1707740460
```

You can pass the same set of [playback modifiers](/docs/guides/modify-playback-behavior) (`program_start_time` and `program_end_time`) on a request for a storyboard and the storyboard will be trimmed appropriately, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/storyboard.png?program_start_time=${EPOCH_TIME}&program_end_time=${EPOCH_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.png?program_start_time=1707740400&program_end_time=1707740460
```

## Using instant clipping in Mux Player

We've also made sure it's easy to pass these parameters to Mux Player when you're using it for playback.

Instant clipping is supported in Mux Player through two paths:

### Using Public Playback IDs: Via extra source params

<Callout type="info">
  This feature was added in mux-player 2.3.0, but we recommend using the latest version at all times.
</Callout>

Here's an example of using the extra source params to pass the `asset_start_time` and `asset_end_time` parameters to mux-player for both video delivery and storyboards:

```html
<mux-player
  playback-id="sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq"
  extra-source-params="asset_start_time=10&asset_end_time=20"
  metadata-video-title="Instant clipping demo (Public)"
  storyboard-src="https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.vtt?format=webp&asset_start_time=10&asset_end_time=20"
></mux-player>
```

The extra source params can also be used for instant clipping of live streams, for both video and storyboards:

```html
<mux-player
  playback-id="sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq"
  extra-source-params="program_start_time=1707740400&program_end_time=1707740460"
  metadata-video-title="Instant clipping demo (Public)"
  storyboard-src="https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.vtt?format=webp&program_start_time=1707740400&program_end_time=1707740460"
></mux-player>
```

### Via signed URLs

When using signed URLs, you must include the clipping parameters as [claims inside the respective JWTs](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url) passed to Mux Player.

For the playback token and the storyboard token, the following parameters should be injected into the JWT claims:

* `asset_start_time` and/or `asset_end_time`
* `program_start_time` and/or `program_end_time`

For the thumbnail token, the `program_time` parameter should be injected into the JWT claim.

Then Mux Player can be loaded in the usual way, passing in the signed tokens:

```html
<mux-player
  playback-id="s6oiUXJ6W1JH02D9ThJZQtyg74ubYTiT7"
  playback-token="${PLAYBACK_TOKEN}"
  storyboard-token="${STORYBOARD_TOKEN}"
  thumbnail-token="${THUMBNAIL_TOKEN}"
  metadata-video-title="Instant clipping demo (Signed)"
></mux-player>
```

## Stream security considerations

We strongly recommend using this feature alongside [signed URLs](/docs/guides/secure-video-playback). When using this feature without signed URLs, it is possible for users to manipulate the manifest playback URL to expose parts of the media that you want to keep hidden.

## Choosing between asset clipping and instant clipping

Not sure if you should be [generating a new asset](/docs/guides/create-clips-from-your-videos) when clipping, or using instant clipping for your workflow? Here are some tips that can help you choose the right approach for your product.

Instant clipping is a great choice when:

* You require a clip to be instantly available
* You need the clips to not incur additional encoding fees
* You need to pre-emptively limit the availability of content to build pre-live workflows for live streaming

You should use our [asset-based clipping](/docs/guides/create-clips-from-your-videos) when:

* You require frame accuracy in your clips
* You require trimmed [MP4s](/docs/guides/enable-static-mp4-renditions) or [masters](/docs/guides/download-for-offline-editing)


# Create clips from your videos
Learn how to create clips from your video files or live stream event recordings.
To drive higher viewer engagement with the videos already on your service, you can create additional videos from your existing library or catalog. These videos could:

* Provide quick previews
* Highlight key moments
* Be polished versions of a live stream with the extra minutes trimmed from the beginning & end (aka preroll and postroll slates) for on-demand replays

Mux can now help you quickly create these kinds of videos using the asset clipping functionality.

<Callout type="info">
If you do not need frame-accurate clips, or you require clips to be available immediately, the [instant clipping feature](/docs/guides/create-instant-clips) may meet your requirements.
</Callout>

## 1. Create a clip

When you [POST a new video](/docs/core/stream-video-files) or [start live streaming](/docs/guides/start-live-streaming), Mux creates a new asset for the video file or live stream event recording.
You can create a clip from an existing asset by making a <ApiRefLink href="/docs/api-reference/video/assets/create-asset">POST request to the /assets endpoint</ApiRefLink> and defining the `input` object's clipping parameters.

* `url` is defined with `mux://assets/{asset_id}` template where `asset_id` is the source Asset Identifier to create the clip from.
* `start_time` is the time offset in seconds from the beginning of the video, indicating the clip's start marker. The default value is 0 when not included.
* `end_time` is the time offset in seconds from the beginning of the video, indicating the clip's end marker. The default value is the duration of the video when not included.

A request and response might look something like this:

### Example request

```bash
curl https://api.mux.com/video/v1/assets \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{
        "inputs": [
          {
            "url": "mux://assets/01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6",
            "start_time": 10.0,
            "end_time": 51.10
          }
        ],
        "playback_policies": [
          "public"
        ],
        "video_quality" : "basic"
      }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}
```

### Example response

```json
{
  "data": {
    "status": "preparing",
    "playback_ids": [
      {
        "policy": "public",
        "id": "TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe"
      }
    ],
    "mp4_support": "none",
    "master_access": "none",
    "id": "kcP3wS3pKcEPywS5zjJk7Q1Clu99SS1O",
    "created_at": "1607876845",
    "video_quality" : "basic",
    "source_asset_id": "01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6"
  }
}
```

Mux creates a new asset for the clip, and the response includes an **Asset ID** and a **Playback ID**.

* Asset IDs are used to manage assets using `api.mux.com` (e.g. to read or delete an asset).
* <ApiRefLink href="/docs/api-reference/video/playback-id">Playback IDs</ApiRefLink> are used to stream an asset to a video player through `stream.mux.com`. You can add multiple playback IDs to an asset to create playback URLs with different viewing permissions, and you can delete playback IDs to remove access without deleting the asset.
* `source_asset_id` is the video or live stream event recording asset used to create the clip. The `source_asset_id` can be useful for associating clips with the source video object in your CMS.

## 2. Wait for "ready" event

When the clip is ready for playback, the asset's `status` changes to `ready`.

The best way to know when this happens is via **webhooks**. Mux can send a webhook notification as soon as the asset is ready. See the [webhooks guide](/docs/core/listen-for-webhooks) for details.

If you can't use webhooks for some reason, you can manually **poll** the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">asset API</ApiRefLink> to check the asset's status. Note that this only works at low volume.

### Build your own request

<CodeExamples product="video" example="retrieveAsset" />

Please don't poll this API more than once per second.
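If you do poll, a minimal sketch that respects the one-request-per-second guidance might look like this (`get_status` stands in for your GET asset call, stubbed here for illustration):

```python
import time

def poll_until_ready(get_status, interval=1.0, max_attempts=600, sleep=time.sleep):
    """Poll the asset's status at most once per `interval` seconds until
    it reaches a terminal state. `get_status` wraps your GET asset request."""
    for _ in range(max_attempts):
        status = get_status()
        if status in ("ready", "errored"):
            return status
        sleep(interval)
    raise TimeoutError("asset never reached a terminal state")

# Stubbed example standing in for real API responses.
statuses = iter(["preparing", "preparing", "ready"])
final = poll_until_ready(lambda: next(statuses), sleep=lambda s: None)
```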

## 3. Play your clip

To play back the video, create a playback URL using the `PLAYBACK_ID` you received when you created the clip.

```curl
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

private let playbackURL: URL = {
    guard let url = URL(string: "https://stream.mux.com/{PLAYBACK_ID}.m3u8") else {
        preconditionFailure("Invalid playback URL")
    }
    return url
}()

struct ContentView: View {
    private let player = AVPlayer(url: playbackURL)

    var body: some View {
        // VideoPlayer comes from SwiftUI.
        VideoPlayer(player: player)
            .onAppear {
                player.play()
            }
    }
}

```



See the [playback guide](/docs/guides/play-your-videos) for more information about how to integrate with a video player.

## FAQs

A few commonly asked questions:

### How many clips can be created from a single source asset?

Unlimited! Mux creates a new asset for each clip, so there is no limit to how many clips you can create.

### Is there a cost to create clips?

Each clip is a new asset and is considered an on-demand video. On-demand video pricing applies, which includes encoding, storage, and delivery usage.

### Can I use basic video quality on clips?

Yes! Clips can be created as either `basic` or `plus`.

### Can I create clips when adding new video files?

Mux only allows creating clips from existing videos in your account. That means the clipping parameters (`start_time` and `end_time`) passed to <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Asset Creation</ApiRefLink> only apply when `input.url` uses the `mux://assets/{asset_id}` format.

### Can I create clips from live streams?

Yes! Mux supports creating clips from the active asset being generated by a live stream while broadcasting. If you clip an asset while the broadcast is active, just remember that the active asset is still growing, so if you don't provide `end_time`, it will default to the end of the asset at the time of creation. As such, when clipping an active asset during the broadcast, for best results you should always provide an `end_time`.

### My source asset has subtitles/captions text tracks. Will the clip have them?

Mux copies all the text tracks from the source asset to the new asset created for the clip. Mux also trims the text tracks to match the clip's start and end markers.

### What other data is copied from the source asset?

Mux copies the captions and watermark image from the source asset to the clips you create. If your source asset does not have a watermark image and you want your clipped asset to have a watermark, pass one in via `overlay_settings`. See more details in the [watermark guide](/docs/guides/add-watermarks-to-your-videos).

All other fields, such as `passthrough`, are not copied over.

### What is the minimum duration for a clip?

Clips must have a duration of at least 500 milliseconds.
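As an illustrative pre-flight check (not part of the Mux SDKs), you can validate a clip range against this minimum before calling the API:

```python
MIN_CLIP_DURATION_SECONDS = 0.5  # clips must be at least 500 ms long

def is_valid_clip_range(start_time: float, end_time: float) -> bool:
    """Return True if the start/end pair yields a clip of at least 500 ms."""
    return end_time - start_time >= MIN_CLIP_DURATION_SECONDS

# A 400 ms clip is too short; a 2 s clip is fine.
print(is_valid_clip_range(10.0, 10.4))  # False
print(is_valid_clip_range(10.0, 12.0))  # True
```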


# Add watermarks to your videos
This guide will show how to add watermarks (overlays) to your videos. Watermarks can be added to assets, live streams, and direct uploads.
A watermark is an image overlaid on a video, often used to brand a video or visually label a specific version of a video.

<Image src="/docs/images/watermark-img.jpg" width={1920} height={1080} />

You can add a watermark to your video using the `overlay_settings` in the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">asset creation API</ApiRefLink>. The first input in your array of inputs must be the video file you want to apply the watermark to, and the second should be the URL to the source watermark image along with placement details. Multiple watermarks are possible using additional inputs as described in our <ApiRefLink href="/docs/api-reference/video/assets/create-asset">API documentation for creating an asset</ApiRefLink>.

<Callout type="info">
  Valid file types for watermarks are `.png` and `.jpg`.

  Other file types such as `.gif`, `.webp`, and `.svg` are not supported at this time.
</Callout>

For a live stream, the `overlay_settings` must be embedded under the `input` array within `new_asset_settings` in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">live stream creation API</ApiRefLink>, and the overlays will apply both to playback through the live stream's playback IDs *and* all assets created from the live stream. The watermark image will be retrieved from this URL at the start of each live stream, so you should make sure that the image will be available at that URL for as long as you plan to use the live stream.

```asset

{
  "inputs": [
    {
      "url": "https://muxed.s3.amazonaws.com/leds.mp4"
    },
    {
      "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
      "overlay_settings": {
        "vertical_align": "bottom",
        "vertical_margin": "2%",
        "horizontal_align": "right",
        "horizontal_margin": "2%",
        "width": "25%",
        "opacity": "90%"
      }
    }
  ],
  "playback_policies": ["public"],
  "video_quality": "basic"
}

```

```direct_upload

{
  "cors_origin": "*",
  "new_asset_settings": {
    "playback_policies": ["public"],
    "video_quality": "basic",
    "inputs": [{
      "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
      "overlay_settings": {
        "vertical_align": "bottom",
        "vertical_margin": "2%",
        "horizontal_align": "right",
        "horizontal_margin": "2%",
        "width": "25%",
        "opacity": "90%"
      }
    }]
  }
}

```

```live_stream

{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "inputs": [
      {
        "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
        "overlay_settings": {
          "vertical_align": "bottom",
          "vertical_margin": "2%",
          "horizontal_align": "right",
          "horizontal_margin": "2%",
          "width": "25%",
          "opacity": "90%"
        }
      }
    ]
  }
}

```



## Positioning with percents vs. pixels

The overlay settings are made to help you position and size a watermark consistently no matter what the size or shape of the input video. When setting the width, height, and margins you have the option of using either percents or pixels.

With percent values, the watermark `width` and `horizontal_margin` are relative to the width of the video, while the `height` and `vertical_margin` are relative to the height of the video. For example, if you set the watermark `horizontal_margin` to 10% for a video that is 1920 pixels wide, the watermark will be 192 pixels from the edge.

```json
{
  "inputs": [
    {
      "url": "{VIDEO_INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "vertical_align": "top",
        "vertical_margin": "10%",
        "horizontal_align": "left",
        "horizontal_margin": "10%"
      }
    }
  ],
  "playback_policies": ["public"]
}
```

<Image src="/docs/images/watermark-percent.png" width={1280} height={1486} />

While the result of using percents is probably easiest to understand, the one shortcoming is positioning a watermark with an exact margin. For example, you may want your horizontal and vertical margins to be exactly equal, or the horizontal margin to be exactly the same for vertical videos as for horizontal videos. Both of those can be a challenge with percents, where the actual result differs depending on the width and height of the video.

Setting margins with pixels lets you be exact with your margins, widths, and heights. However, you can't always control the size of the input video, and a watermark that is 80px wide would look very different on a video that is 960 pixels wide compared to one that is 1920 pixels wide. For that reason, when you use pixel values in your overlay settings they are always applied as if the video were first scaled to fit 1920x1080 for horizontal videos or 1080x1920 for vertical videos. So in that example, the watermark would be 80px wide on the 1920px-wide video, and 40px wide on the 960px-wide video.
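To make the scaling rule concrete, here's a sketch (illustrative only, not Mux's actual implementation) of how a pixel-based width resolves against the 1920x1080 / 1080x1920 reference canvas:

```python
def effective_pixel_width(requested_px: float, video_width: int, video_height: int) -> float:
    """Approximate the rendered width of a pixel-based overlay setting.

    Pixel values are applied as if the video were first scaled to fit
    1920x1080 (horizontal) or 1080x1920 (vertical), so the watermark keeps
    the same relative size regardless of the source resolution.
    """
    reference_width = 1920 if video_width >= video_height else 1080
    return requested_px * video_width / reference_width

# An 80px watermark renders at 80px on a 1920-wide video...
print(effective_pixel_width(80, 1920, 1080))  # 80.0
# ...and at 40px on a 960-wide video.
print(effective_pixel_width(80, 960, 540))    # 40.0
```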

```json
{
  "inputs": [
    {
      "url": "{INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "width": "80px",
        "vertical_align": "top",
        "vertical_margin": "40px",
        "horizontal_align": "left",
        "horizontal_margin": "40px"
      }
    }
  ],
  "playback_policies": ["public"]
}
```

<Image src="/docs/images/watermark-pixel.png" width={1330} height={1550} />

The reason behind this is that your watermark should look the same no matter what the original size of the input video, and videos are most often scaled to fit the player window or the screen of the device.

## Center a watermark

<Image src="/docs/images/watermark-center.jpg" width={1920} height={1080} />

To center a watermark on the video, simply set `vertical_align` to "middle" and `horizontal_align` to "center".

```json
{
  "inputs": [
    {
      "url": "{INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "vertical_align": "middle",
        "horizontal_align": "center"
      }
    }
  ],
  "playback_policies": ["public"]
}
```


# Adjust audio levels
This guide will show how to adjust the audio level of your videos. Audio normalization can be applied to on-demand assets.
## What is audio normalization?

Here at Mux, when we refer to audio normalization, we mean loudness normalization. Loudness normalization adjusts the recording based on perceived loudness.

Below is an audio stream *before* normalization:

<Image src="/docs/images/audio-norm-before.png" width={640} height={200} alt="Audio norm before" />

And the same audio stream *after* normalization:

<Image src="/docs/images/audio-norm-after.png" width={640} height={200} alt="Audio norm after" />

LUFS, which stands for Loudness Units relative to Full Scale, is a measurement of loudness over the entire length of an audio stream, and it is the measurement used in the normalization process.
The goal of normalizing is to apply the gain needed to bring the average loudness to a target level: the "norm".
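As a rough sketch of the idea (not Mux's exact pipeline), the gain applied during loudness normalization is simply the difference between the target loudness and the measured integrated loudness:

```python
def normalization_gain_db(measured_lufs: float, target_lufs: float = -24.0) -> float:
    """Gain (in dB) needed to move a stream's integrated loudness to the target.

    The -24.0 default matches Mux's current target (see "Target loudness" below).
    """
    return target_lufs - measured_lufs

# A quiet stream measured at -30 LUFS gets +6 dB of gain...
print(normalization_gain_db(-30.0))  # 6.0
# ...while a loud stream at -18 LUFS is reduced by 6 dB.
print(normalization_gain_db(-18.0))  # -6.0
```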

## When to use audio normalization

The main use of audio normalization is to standardize the perceived loudness across your assets. Whether to use normalization at all depends on the content.
When audio levels are reasonable and audio quality is high, normalization can be beneficial. Note, however, that like any other video or audio processing, normalization will change your audio in some way,
so make an informed decision about whether to use this feature.

## How to turn on audio normalization

Currently, the only way to enable audio normalization on a Mux asset is through the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create asset endpoint</ApiRefLink>. You cannot update this setting after the asset has been created. It also applies only to on-demand assets (including audio-only assets), not live streams.

To enable audio normalization on your asset ingest, set the `normalize_audio` key to `true` in the body of your asset creation request. By default, this boolean is `false`.

A typical request and response might look something like this:

### Example request

```bash
curl https://api.mux.com/video/v1/assets \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{
        "inputs": [
          {
            "url": "https://example.com/myVideo.mp4"
          }
        ],
        "playback_policies": ["public"],
        "video_quality": "basic",
        "normalize_audio": true
    }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}
```

### Example response

```json
{
    "data": {
        "status": "preparing",
        "playback_ids": [
            {
                "policy": "public",
                "id": "006Hx6bozgZv2sL9700Y8TT02MKdw4nq01ipMVawIGV9j000"
            }
        ],
        "normalize_audio": true,
        "mp4_support": "none",
        "master_access": "none",
        "id": "jlJydoVkYh01Z3JrLr02RGcp4mJdLvPRbk9n00000",
        "video_quality": "basic",
        "created_at": "1612979762"
    }
}
```

## Target loudness

Our target loudness value for audio normalization in our video stack is currently -24 LUFS, so if possible, master your audio with this value in mind.


# Start live streaming
In this guide you will learn how to build a live streaming platform with Mux live streaming.
Whether you’re looking to build “Twitch for X”, online classrooms, a news & sports broadcasting platform, or something the world’s never seen before, the Mux Live Streaming API makes it easy to build live video into your own software. With a simple API call you get everything you need to push a live stream and play it back at high quality for a global audience.

<Image src="/docs/images/live-streaming-overview-2.png" width={1798} height={1040} />

## 1. Get an API Access Token

<Callout type="info">
  For a guided example of how to make API Requests from your local environment, see the guide and watch this video tutorial: [Make API Requests](/docs/core/make-api-requests).
</Callout>

The Mux Video API uses a token key pair that consists of a **Token ID** and **Token Secret** for authentication. If you haven't already, generate a new Access Token in the [Access Token settings](https://dashboard.mux.com/settings/access-tokens) of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} alt="Mux access token settings" />

The access token should have Mux Video **Read** and **Write** permissions.

<Image src="/docs/images/new-access-token.png" width={760} height={376} alt="Mux Video access token permissions" sm />

Access Tokens also belong to an Environment. Be sure to use the same Environment when using Mux Video and Mux Data together, so the data from Mux Data can be used to optimize your Mux Video streams.

## 2. Create a unique Live Stream

<ApiRefLink href="/docs/api-reference/video/live-streams">Detailed API reference</ApiRefLink>

The Live Stream object in the Mux API is a record of a live stream of video that will be pushed to Mux. To create your first Live Stream, make a <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">POST request to the /live-streams endpoint</ApiRefLink>.

You can either replace `${MUX_TOKEN_ID}` and `${MUX_TOKEN_SECRET}` with your own access token details or make sure to export those environment variables with the correct values first.

```curl

curl https://api.mux.com/video/v1/live-streams \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{ "playback_policies": ["public"], "new_asset_settings": { "playback_policies": ["public"] } }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}

```

```elixir

# config/dev.exs
config :mux,
  access_token_id: "MUX_TOKEN_ID",
  access_token_secret: "MUX_TOKEN_SECRET"

client = Mux.client()
{:ok, live_stream, _env} = Mux.Video.LiveStreams.create(client, %{playback_policy: "public", new_asset_settings: %{playback_policy: "public"}});

```

```go

import (
    muxgo "github.com/muxinc/mux-go"
)

client := muxgo.NewAPIClient(
    muxgo.NewConfiguration(
        muxgo.WithBasicAuth(os.Getenv("MUX_TOKEN_ID"), os.Getenv("MUX_TOKEN_SECRET")),
    ))

createAsset := muxgo.CreateAssetRequest{PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}}
createLiveStream := muxgo.CreateLiveStreamRequest{NewAssetSettings: createAsset, PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}}
live_stream, err := client.LiveStreamsApi.CreateLiveStream(createLiveStream)

```

```node

import Mux from '@mux/mux-node';
const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET
});

await mux.video.liveStreams.create({
  playback_policy: ['public'],
  new_asset_settings: { playback_policy: ['public'] },
});

```

```php

$config = MuxPhp\Configuration::getDefaultConfiguration()
  ->setUsername(getenv('MUX_TOKEN_ID'))
  ->setPassword(getenv('MUX_TOKEN_SECRET'));

$liveApi = new MuxPhp\Api\LiveStreamsApi(
    new GuzzleHttp\Client(),
    $config
);

$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC]]);
$createLiveStreamRequest = new MuxPhp\Models\CreateLiveStreamRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC], "new_asset_settings" => $createAssetRequest]);
$stream = $liveApi->createLiveStream($createLiveStreamRequest);

```

```python

import mux_python

configuration = mux_python.Configuration()
configuration.username = os.environ['MUX_TOKEN_ID']
configuration.password = os.environ['MUX_TOKEN_SECRET']

live_api = mux_python.LiveStreamsApi(mux_python.ApiClient(configuration))
new_asset_settings = mux_python.CreateAssetRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC])
create_live_stream_request = mux_python.CreateLiveStreamRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC], new_asset_settings=new_asset_settings)
create_live_stream_response = live_api.create_live_stream(create_live_stream_request)

```

```ruby

MuxRuby.configure do |config|
  config.username = ENV['MUX_TOKEN_ID']
  config.password = ENV['MUX_TOKEN_SECRET']
end

create_asset_request = MuxRuby::CreateAssetRequest.new
create_asset_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_live_stream_request = MuxRuby::CreateLiveStreamRequest.new
create_live_stream_request.new_asset_settings = create_asset_request
create_live_stream_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_live_stream_request.latency_mode = "reduced"

```



The response will include a **Playback ID** and a **Stream Key**.

* <ApiRefLink href="/docs/api-reference/video/assets/get-asset-playback-id">Playback IDs</ApiRefLink> for a Live Stream can be used the same way as Playback IDs for an Asset. You can use them to [play video](/docs/guides/play-your-videos), [get images from a video](/docs/guides/get-images-from-a-video), or [build timeline hover previews with your player](/docs/guides/create-timeline-hover-previews).
* The **Stream Key** is a **secret** that can be used along with Mux's RTMP Server URL (see table below) to configure RTMP streaming software.

<Callout type="warning" title="Important">
  The *Stream Key* should be treated as a **private key for live streaming**. Anyone with the key can use it to stream video to the Live Stream it belongs to, so make sure your users know to keep it safe. If you lose control of a stream key, you can either delete the Live Stream or <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset the stream key</ApiRefLink>.
</Callout>

```json
{
  "data": {
    "id": "QrikEQpEXp3RvklQSHyHSYOakQkXlRId",
    "stream_key": "super-secret-stream-key",
    "status": "idle",
    "playback_ids": [
      {
        "policy": "public",
        "id": "OJxPwQuByldIr02VfoXDdX6Ynl01MTgC8w02"
      }
    ],
    "created_at": "1527110899"
  }
}
```

Mux also allows you to set a few additional options on your live stream to support more use cases.

| Option | Description |
|--------|-------------|
| `"latency_mode": "reduced"` | Mux live streams have an option for "reduced latency". When `"latency_mode": "reduced"` is enabled, we treat your stream a little differently to minimize glass-to-glass latency. Latency drops to about 10-15 seconds, compared to the typical 30 seconds without this option. For more details, please refer to the [Live Stream Latency guide](/docs/guides/reduce-live-stream-latency). |
| `"latency_mode": "low"` | Similar to the `"reduced"` latency option, `"latency_mode": "low"` live streams reduce glass-to-glass latency to as low as 5 seconds, though latency can vary depending on your viewer's geographical location and internet connectivity. For more details, please refer to the [Live Stream Latency guide](/docs/guides/reduce-live-stream-latency). |
| `audio_only` | Mux live streams are ready for audio-specific use cases too, such as live podcasts or radio broadcasts. When `audio_only` is enabled, we only process the audio track, dropping the video track even if one is broadcast. |

<Callout type="warning">
  A live stream can be configured with only one latency mode: reduced, low, or standard.
</Callout>

You can find more details about these options in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create Live Stream API reference</ApiRefLink>.

## 3. Start your broadcast

Mux supports live streaming using the RTMP protocol, which is supported by most broadcast software/hardware as well as open source software for mobile applications.

Your users or your client app will need software that can push an RTMP stream. That software will be configured using the **Stream Key** from the prior step along with Mux's RTMP Server URL. Mux supports both RTMP and RTMPS:

| RTMP Server URL |  Description | Common Applications |
|---|---|---|
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP entry point. Compatible with the majority of streaming applications and services. | Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/). |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS entry point. Compatible with fewer streaming applications, but offers a higher level of security. | OBS, Wirecast, Streamaxia RTMP SDKs |

Learn more about:

* [Additional regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls) for when you want control over the geographic region receiving your user's livestream
* [How to configure broadcast software](/docs/guides/configure-broadcast-software) for when users will be using their own streaming software, e.g. Twitch live streamers
* [How to live stream from a mobile app](/docs/guides/live-streaming-from-your-app) for when users will live stream using your mobile app

<Callout type="warning" title="Important">
  Mux's RTMP server URL uses port number 5222 and not the standard RTMP port number 1935. If your encoder does not provide a method to change the port number, please contact [our support team](/support) with your encoder details.
</Callout>

[Mux Video also supports Secure Reliable Transport (SRT) for receiving live streams](/docs/guides/use-srt-to-live-stream). If you want to live stream with a protocol other than RTMP or SRT, let us know!

<Image src="/docs/images/obs-setup.png" width="1954" height="1492" alt="obs setup" />

The broadcast software will describe how to start and stop an RTMP session. Once the session begins, the software will start pushing live video to Mux, and the Live Stream will change its status to `active`, indicating it is receiving the RTMP stream and is playable using the Playback ID.

### Broadcasting Webhooks

When a Streamer begins sending video and the Live Stream changes status, your application can respond by using [Webhooks](/docs/core/listen-for-webhooks). There are a few related events that Mux will send. Your application may benefit from some or none of these events, depending on the specific user experience you want to provide.

| Event | Description |
|-------|-------------|
| `video.live_stream.connected` | The Streamer's broadcasting software/hardware has successfully connected with Mux servers. Video is not yet being recorded and is not yet playable. |
| `video.live_stream.disconnected` | The Streamer's broadcasting software/hardware has disconnected from Mux servers, either intentionally or unintentionally because of a network drop. |
| `video.live_stream.recording` | Video is being recorded and prepared for playback. The recording of the live stream (the Active Asset) will include video sent after this point. If your UI has a red "recording" light, this would be the event that turns it on. |
| `video.live_stream.active` | The Live Stream is now playable using the Live Stream's Playback ID or the Active Asset's Playback ID |
| `video.live_stream.idle` | The Streamer's broadcasting software/hardware previously disconnected from Mux servers and the `reconnect_window` has now expired. The recording of the live stream (the Active Asset) will now be considered complete. The next time video is streamed using the same Stream Key it will create a new Asset for the recording. |
| `video.asset.live_stream_completed` | This event is fired by the Active Asset when the Live Stream enters the `idle` state and the Active Asset is considered complete. The Asset's playback URL will switch to being an "on-demand" (not live) video. |

## 4. Play back your live stream

To play back a live stream, use the `PLAYBACK_ID` that was returned when you created the Live Stream along with stream.mux.com to create an HTTP Live Streaming (HLS) playback URL.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

private let playbackURL: URL = {
    guard let url = URL(string: "https://stream.mux.com/{PLAYBACK_ID}.m3u8") else {
        preconditionFailure("Invalid playback URL")
    }
    return url
}()

struct ContentView: View {
    private let player = AVPlayer(url: playbackURL)

    var body: some View {
        // VideoPlayer comes from SwiftUI.
        VideoPlayer(player: player)
            .onAppear {
                player.play()
            }
    }
}

```



See the [playback guide](/docs/guides/play-your-videos) for more information about how to integrate with a video player.

After you have everything working, [integrate Mux Data](/docs/guides/track-your-video-performance) with your player to monitor playback performance.

## 5. Stop your broadcast

When the Streamer is finished, they will stop the broadcast software/hardware, which will disconnect from the Mux servers. After the `reconnect_window` time (if any) runs out, the Live Stream will transition to a status of `idle`.

<Callout type="info">
  Mux automatically disconnects clients after 12 hours. Contact us if you require longer live streams.
</Callout>

## 6. Manage your Mux Live streams

After you have live streams created in your Mux environment, you may find some of these other endpoints handy:

* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/list-live-streams">List live streams</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream">Retrieve a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream">Delete a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-playback-id">Create a live stream playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-playback-id">Delete a live stream playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">Reset a stream key for a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">Signal a live stream is finished</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">Disable a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">Enable a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-simulcast-target">Create a live stream simulcast target</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-simulcast-target">Delete a live stream simulcast target</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream-simulcast-target">Retrieve a live stream simulcast target</ApiRefLink>

More Video methods and descriptions are available at the <ApiRefLink href="/docs/api-reference/video">API Docs</ApiRefLink>.

<GuideCard
  title="Play your live stream"
  description="Set up your iOS application, Android application or web application to start playing your Mux assets"
  links={[
    {title: "Read the guide", href: "/docs/guides/play-your-videos"},
  ]}
/>

<GuideCard
  title="Integrate Mux Data"
  description="Add the Mux Data SDK to your player and start collecting playback performance metrics."
  links={[
    {title: "Read the guide", href: "/docs/guides/track-your-video-performance"},
  ]}
/>


# Configure Broadcast Software
There are a number of popular (and even free) software encoders that you can use with the Mux live streaming API. Hardware encoders that allow for custom RTMP server configuration have similar settings. This guide details how to configure a few common encoders.
## Overview / configuration term glossary

Most broadcasting software uses a standard set of terms. Mux has chosen a set of terms that are very commonly used.

* **Server URL** - This is the URL of the Mux RTMP server, as listed in the table below.
* **Stream Key** - The Stream Key is essentially used to authenticate your live stream with the Mux RTMP server. This is your secret key to live streaming. Mux does not use additional authentication.

***

| RTMP Server URL | Description | Common Applications |
| :-- | :-- | :-- |
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP entry point. Compatible with the majority of streaming applications and services | Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/) |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS entry point. Compatible with fewer streaming applications, but offers a higher level of security | OBS, Wirecast, Streamaxia RTMP SDKs |

***

Here is a list of other terms that we have heard:

* **Stream Name** - A common alias and the technically correct term (in the RTMP specification) for *Stream Key*.
* **Location** or **URL** - Broadcast software that asks for just a location or a URL often wants a combination of the *Server URL* and the *Stream Key*, like `rtmp://global-live.mux.com:5222/app/{STREAM_KEY}`. If a location or URL is asked for alongside a stream name/key, then it is an alias for *Server URL*.
* **FMS URL** - Flash Media Server URL, an alias for *Server URL*.
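For software that wants the single combined form, the join is straightforward (a sketch; the stream key below is a placeholder):

```python
def rtmp_location(server_url: str, stream_key: str) -> str:
    """Combine a Server URL and Stream Key into the single
    'Location'/'URL' string some broadcast software expects."""
    return f"{server_url.rstrip('/')}/{stream_key}"

print(rtmp_location("rtmp://global-live.mux.com:5222/app", "YOUR_STREAM_KEY"))
# rtmp://global-live.mux.com:5222/app/YOUR_STREAM_KEY
```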

Seen or heard a term that you don't understand? Ask us! Think we missed something that you know? Leave a comment at the bottom of the page!

<Callout type="info">
  Mux's RTMP server URL uses port number 5222 and not the standard RTMP port number 1935. If your encoder does not provide a method to change the port number, please contact [support](/support) with your encoder details.
</Callout>

## Recommended encoder settings

Twitch has a clear and concise [guide to broadcast encoder settings](https://help.twitch.tv/s/article/broadcasting-guidelines?language=en_US). YouTube has [a bit more detailed guide](https://support.google.com/youtube/answer/2853702) as well. Here's a very simple recommendation of where to start, but we do recommend playing with your settings to see what works best for your content:

### Common

* **Video CODEC** - H.264 (Main Profile)
* **Audio CODEC** - AAC

### Great - 1080p 30fps

* **Bitrate** - 5000 kbps
* **Keyframe Interval** - 2 seconds

### Good - 720p 30fps

* **Bitrate** - 3500 kbps
* **Keyframe Interval** - 2 seconds

### Works - 480p 30fps

* **Bitrate** - 1000 kbps
* **Keyframe Interval** - 5 seconds

<Callout type="warning" title="Important">
  You should also consider your available upload bandwidth when choosing an encoder bitrate. For a more reliable connection, we recommend using no more than ~50% of the available upload bandwidth for your live stream ingest.
</Callout>
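That rule of thumb can be sketched as a quick pre-broadcast check (illustrative numbers; measure your real uplink):

```python
def max_recommended_bitrate_kbps(upload_kbps: float, headroom: float = 0.5) -> float:
    """Cap the encoder bitrate at ~50% of measured upload bandwidth."""
    return upload_kbps * headroom

# On a 10 Mbps (10000 kbps) uplink, stay at or below ~5000 kbps,
# which matches the 1080p30 recommendation above.
print(max_recommended_bitrate_kbps(10000))  # 5000.0
```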

## Alternate ingest protocols

Mux Video also supports [Secure Reliable Transport (SRT) for receiving live streams](/docs/guides/use-srt-to-live-stream).

## Available Ingest URLs

Mux's regional ingest URLs let you manually select your ingest region. This may be useful if you notice DNS is not routing your traffic efficiently, or if you would like to manage your own failover process.

| Region | RTMP Ingest URL | SRT Ingest URL |
| :-- | :-- | :-- |
| Global (Auto-Select) | `rtmp://global-live.mux.com/app` | `srt://global-live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| U.S. East | `rtmp://us-east.live.mux.com/app` | `srt://us-east.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| U.S. West | `rtmp://us-west.live.mux.com/app` | `srt://us-west.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| Europe | `rtmp://eu-west.live.mux.com/app` | `srt://eu-west.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |

<Callout type="info" title="RTMPS support">
  All of these RTMP URLs support RTMPS.

  For example, `rtmp://us-east.live.mux.com/app` becomes `rtmps://us-east.live.mux.com/app`
</Callout>

### Choosing the right ingest URL

* If you want Mux to automatically route to the best region, use `global-live.mux.com`.
* If you prefer manual control over routing, use a specific regional ingest URL (e.g., `us-east.live.mux.com`).
* For redundancy, configure your encoder to failover to another regional endpoint.

### Using regional ingest URLs in OBS

To set up OBS with Mux Live Streaming:

1. Go to: Settings → Stream

2. Select "Custom..." as the service

3. Enter the Ingest URL based on your preferred region

   `rtmps://us-east.live.mux.com/app`

4. Enter your Stream Key (found in your Mux Live settings)

5. Click "Start Streaming"

### Building your SRT URL

Note: Before you use an SRT URL, make sure your encoder supports SRT caller mode.

The SRT URL is composed of three parts:

1. The protocol and host: `srt://us-east.live.mux.com:6001`
2. A `streamid` query parameter
3. A `passphrase` query parameter

Here's an example:

```
srt://us-east.live.mux.com:6001?streamid=abc-123-def-456&passphrase=GHI789JKL101112
```
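The same composition can be sketched in shell, using the placeholder values from the example above in place of your real stream key and passphrase:

```shell
# Compose an SRT URL from its three parts.
# These values are the placeholder examples above, not real credentials.
HOST="srt://us-east.live.mux.com:6001"
STREAM_KEY="abc-123-def-456"
SRT_PASSPHRASE="GHI789JKL101112"

SRT_URL="${HOST}?streamid=${STREAM_KEY}&passphrase=${SRT_PASSPHRASE}"
echo "$SRT_URL"
```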

For more information on SRT, check out our [Use SRT to live stream](https://www.mux.com/docs/guides/use-srt-to-live-stream) docs.

## Software encoders

Any encoder that supports RTMP should work with Mux Video.

* [OBS](https://obsproject.com/) (Free and Open Source)
* [Wirecast](https://www.telestream.net/wirecast/) (Commercial)
* [XSplit](https://www.xsplit.com/broadcaster) (Commercial)
* [vMix](https://www.vmix.com) (Commercial)

## Hardware encoders

Any encoder that supports RTMP should work with Mux Video.

* [VidiU](https://teradek.com/collections/vidiu-family)
* [DataVideo RTMP Encoders](https://www.datavideo.com/global/category/video-encoder)
* [Magewell Ultra Stream](https://www.magewell.com/ultra-stream)
* [Osprey Talon](https://www.ospreyvideo.com/talon-encoders) (contact [sales@ospreyvideo.com](mailto:sales@ospreyvideo.com) for documentation)
* [Videon](https://support.videonlabs.com/hc/en-us/articles/4408934112659-Stream-to-Mux-RTMP-)

## Mobile devices (iOS, Android)

If you just want a pre-built iOS application you can stream from, [check out our write up here](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/).

If you want to build your own application, [check out this documentation](/docs/guides/live-streaming-from-your-app).


# Use SRT to live stream
Learn how to use the Secure Reliable Transport (SRT) protocol to send a live stream to Mux
## Learn about SRT

SRT is a modern, common alternative to RTMP and is designed for high-quality, reliable point-to-point video transmission over unreliable networks.

The Mux Video SRT feature also supports using [HEVC (h.265) as the live stream input codec](#use-the-hevc-codec-with-srt), reducing inbound bitrate requirements.

SRT is supported by a wide range of free and commercial video encoders.

## Use SRT to connect to a live stream

Authentication for SRT is a little different than with RTMP, and requires two pieces of information:

1. `streamid` This is the same `stream_key` attribute you know & love from RTMP
2. `passphrase` This is a new piece of information exposed in the Live Streams API called `srt_passphrase`. You'll need to use this when your encoder asks you for a `passphrase`.

All new and existing live streams now expose the `srt_passphrase` field.

You can get this field through the API using the <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream">Get Live Stream API call</ApiRefLink>:

```json
// GET https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}
{
  // [...]
  "stream_key": "abc-123-def-456",
  "srt_passphrase": "GHI789JKL101112",
  // [...]
}
```
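For example, you can fetch those fields with curl, assuming your access token is available in the `MUX_TOKEN_ID` and `MUX_TOKEN_SECRET` environment variables:

```shell
curl -X GET \
  -H "Content-Type: application/json" \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
  'https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}'
```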

You can also see the SRT connection details from the dashboard of any live stream:

<Image src="/docs/images/srt-dashboard.png" width={1193} height={472} alt="SRT connection details in the Mux Live Stream dashboard" />

## Configure your encoder

Depending on the encoder you're using, the exact path to setting the SRT configuration will vary.

Some encoders accept all configuration parameters in the form of an SRT URL, in which case you'll need to construct an SRT URL as below, substituting the `stream_key` and `srt_passphrase`.

```
srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}
```

Mux's global SRT ingest URLs will connect you to the closest ingest region. While these ingest URLs typically provide optimal performance, you can also select a specific region using our [regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls).

### Common Configuration values

Other encoders break the SRT configuration out into multiple fields; fill them out as follows:

| Field | Value |
| --- | --- |
| Hostname / URL / Port | `srt://global-live.mux.com:6001` |
| Stream ID | Use the `stream_key` from your live stream. |
| Passphrase | Use the `srt_passphrase` field from your live stream. |
| Mode | Should be set to `caller` if required. |
| Encryption Key Size / Length | Set to `128` if the field is expressed in bits, or `16` if expressed in bytes (`pbkeylen`) |

### Tuning

You may need some of the tuning settings below.

| Field | Value | Notes |
| --- | --- | --- |
| Latency | `500` is generally a safe starting value | Set to at least 4× the RTT to `global-live.mux.com` |
| Bandwidth | `25%` is generally a safe starting value | Set to the percentage of overhead you have available in your internet connection for bursts of retransmission. For example, if you have a 5Mbps internet connection, and you set your encoder's target bitrate to 4Mbps, a value of 25% would be appropriate, as it would allow the encoder to burst to 5Mbps for retransmission purposes. |
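As a worked example of the latency guidance above, a measured round-trip time of 80ms (a placeholder; measure yours with, for example, `ping global-live.mux.com`) implies a minimum latency setting of 320ms:

```shell
# Derive a minimum SRT latency value from a measured round-trip time.
RTT_MS=80
MIN_LATENCY_MS=$((4 * RTT_MS))
echo "Set SRT latency to at least ${MIN_LATENCY_MS} ms"
```

In this case the default of `500` comfortably covers the computed minimum.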

## Stream!

If you've configured your encoder correctly, you should be all set to connect your encoder and start streaming. You can then confirm the live stream appears in your Mux dashboard.

You will see all the usual state transitions, events, and webhooks that you'd expect when connecting from an RTMP source.

## Example Encoder Configuration

### OBS

OBS accepts the SRT endpoint as a single URL, structured as shown below:

<Image src="/docs/images/srt-obs-config.png" width={677} height={150} alt="SRT configuration for OBS" />

*Stream Key should be left empty as both the Stream ID and the Passphrase are being set in the URL field.*

### Videon

Videon encoders need each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-videon-config.png" width={544} height={700} alt="SRT configuration for Videon" />

### Wirecast

Wirecast needs each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-wirecast-config.png" width={1037} height={538} alt="SRT configuration for Wirecast" />

### Larix Broadcaster on iOS and Android

Larix Broadcaster also needs each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-larix-combined-config.png" width={726} height={700} alt="SRT configuration for Larix Broadcaster on iOS and Android" />

### FFmpeg

FFmpeg takes an SRT URL with the parameters on the URL, for example:

```shell
ffmpeg \
  -f lavfi -re -i testsrc=size=1920x1080:rate=30 \
  -f lavfi -i "sine=frequency=1000:duration=3600" \
  -c:v libx264 -x264-params keyint=120:scenecut=0 \
  -preset superfast -b:v 5M -maxrate 6M -bufsize 3M -threads 4 \
  -c:a aac \
  -f mpegts 'srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}'
```

### Gstreamer

Gstreamer takes an SRT URL with the parameters on the URL, for example:

```shell
gst-launch-1.0 -v videotestsrc ! queue ! video/x-raw, height=1080, width=1920 \
  ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=main \
  ! ts. audiotestsrc ! queue ! avenc_aac ! mpegtsmux name=ts \
  ! srtsink uri='srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}'
```

## Use the HEVC codec with SRT

When sending a live stream for ingest over SRT, Mux supports [HEVC](https://www.mux.com/video-glossary/hevc-high-efficiency-video-coding) (h.265) as the contribution codec.

Using HEVC generally allows you to reduce the inbound bitrate of your live stream without sacrificing quality. The amount you can reduce the bitrate by will vary depending on the encoder that you're using, but generally this would be between 30% and 50%.

## Simulcast using SRT

Streams sent to Mux over SRT can be simulcast to both SRT and RTMP destinations.

To configure a simulcast destination as SRT, pass the SRT URL in the `url` field when creating a Simulcast Target, as shown below:

```json
POST /video/v1/live-streams/{LIVE_STREAM_ID}/simulcast-targets

{
  "url" : "srt://my-srt-server.example.com:6001?streamid=streamid&passphrase=passphrase",
  "passthrough" : "My SRT Destination"
}
```
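If you're calling the API directly, the same request can be made with curl, assuming your access token is available in the `MUX_TOKEN_ID` and `MUX_TOKEN_SECRET` environment variables:

```shell
curl -X POST \
  -H "Content-Type: application/json" \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
  -d '{
    "url": "srt://my-srt-server.example.com:6001?streamid=streamid&passphrase=passphrase",
    "passthrough": "My SRT Destination"
  }' \
  'https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}/simulcast-targets'
```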

## Simulcasting and HEVC over SRT

When simulcasting an inbound SRT stream encoded with HEVC, Mux does not currently transcode the output stream, so make sure the simulcast destination supports the HEVC codec.

Below is a current list of the codecs and protocols supported by common simulcast destinations:

| Platform | Protocols | Codecs |
| --- | --- | --- |
| Facebook | RTMP(S) | h.264 |
| X (Twitter) | RTMP(S), HLS Pull | h.264 |
| YouTube | RTMP(S), HLS Pull, SRT (Closed Beta) | h.264, HEVC, AV1 |
| Twitch / IVS | RTMP(S) | h.264 |

## Known limitations

### Simulcast retains source codec

[See simulcasting notes above.](#simulcasting-and-hevc-over-srt)

### Cross-protocol and cross-codec reconnects

We do not support switching ingest protocols or codecs within a reconnect window.  If you want to reuse the same Live Stream with different protocols you'll need to wait for the reconnect period to expire or call the <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">Complete Live Stream API</ApiRefLink>.

### Embedded Captions

Embedded captions (CEA-608) are not supported. Auto-generated captions *can* be used with SRT live streams.

### Multi-track audio

While we support multiple audio tracks in an SRT stream, we recommend against sending more than one, as there's no mechanism to configure which will be used. Mux will choose the first audio stream listed in the PMT; other audio streams will be dropped.

## Feedback

We'd love to hear your feedback as you use SRT. If you run into issues or have feedback, please contact [Mux Support](/support), and we'll get back to you.


# Live streaming from your app
Use this guide to set up your application for live streaming to Mux.
## Live building blocks

A recap from our [Start live streaming](/docs/guides/start-live-streaming) guide:

<Image sm src="/docs/images/live-streaming-overview-2.png" width={1798} height={1040} />

Live streaming from a native application requires software that captures the camera feed and streams it to the Mux live endpoint using the RTMP protocol. Fortunately, for both iOS and Android, you can find open-source software to stream RTMP. The following open-source applications can be used as a guide for building live streaming into your own app.

## iOS & Android examples

<Callout type="warning" title="Use current examples">
  Use the examples linked in this guide. They will contain the most current code and issue list. We may not provide support for outdated apps and dependencies.
</Callout>

* [iOS Live Streaming Example](https://github.com/muxinc/examples/tree/master/ios-live-streaming)
* [Android Live Streaming Example](https://github.com/muxinc/examples/tree/master/android-live-streaming)

Over time we'll build out more examples and SDKs for iOS and Android. If you have any feedback or requests please let us know.

If you're looking for a commercial solution, [Streamaxia's OpenSDK](https://www.streamaxia.com/) and [Larix Broadcaster](https://softvelum.com/larix/) are known to work well with Mux's RTMP ingest.

## Web app live streaming

There aren't any reliable open-source solutions for building web-based encoders that stream over RTMP. Check out [the blog post](https://mux.com/blog/the-state-of-going-live-from-a-browser) for more information on going live from the browser.


# Reduce live stream latency
This guide covers types of latency, causes of latency, reconnect windows, and lower latency options.
Mux Video live streaming is built with RTMP ingest and HLS delivery. HLS inherently introduces latency.
To the broadcasting industry, this latency is called glass-to-glass latency. Standard glass-to-glass latency with HLS
is greater than 20 seconds and typically about 25 to 30 seconds.

To clarify some terminology and industry jargon:

* **Glass-to-glass latency**: Also sometimes referred to as end-to-end latency. This latency is defined as the time lag between when a camera captures an action and when that action reaches a viewer’s device.
* **Wall-clock time**: Also might be referred to as "realtime". If you have a clock on the wall where you are capturing video content, this would be the time on that clock.

The nature of HLS delivery means that clients are not necessarily synchronized. Some clients might be 15 seconds behind wall-clock time and others might be 30 seconds behind.

# Where does the latency come from?

You don't have to worry about these gritty details when using Mux for live streams, but to give you an idea of how a live stream works:

<Image src="/docs/images/live-stream-workflow.png" width={2250} height={1848} />

1. **Captured by a camera**
2. **Processed by an encoder** - If the computer running the encoder runs out of CPU, this process can fall behind and start lagging.
3. **Sent to an RTMP ingest server** - This server ingests the video content in real time. This part is called the "first mile"; it happens over the internet, often on consumer or cellular network connections, so things like TCP packet loss and random network disconnects are always happening.
4. **Ingest server decodes and encodes** - Assuming all the content is traveling over the internet fast enough, the encoder on the other end needs to keep up and have enough CPU available to package up segments of video as they come in. The encoder has to ingest video, build up a buffer of content and then start decoding, processing and encoding for HLS delivery.
5. **Manifest files and segments of video delivered** - After all of that, files are created and delivered over HTTP through multiple CDNs to reach end users. Each file becomes available after the entire segment's worth of data is ready. This part also happens over the internet where the same risks around packet-loss and network congestion are factors. Network issues are especially a factor for the last mile of delivery to the end user.
6. **Decoded and played on the client** - When video makes it all the way to the client, the player has to decode and play back the video. Players do not render each segment as they receive it; they keep a buffer of playable video in memory, which also contributes to the glass-to-glass latency experienced by the end user.

When you consider each of the steps above, any point of that pipeline has the potential to slow down or get backed up. The more latency you can tolerate,
the safer the system is and the lower probability you have for an unhappy viewer. If any single step gets backed up momentarily, the whole system has a chance
to catch up before an interruption in playback. And, when everything is running smoothly, the player has extra time to spend downloading the higher quality version of your content.

<Callout type="info">
  As shown in the image above, latency is added at every step. However, Mux does not control the latency added by video capture on the camera,
  encoder processing delays, or the video player's buffering and decoding time.
</Callout>

# Reconnect Window

When an end-user is streaming from their encoder to Mux, we need to know how to handle situations when the client disconnects unexpectedly.

There are situations when a client disconnects on purpose: for example, hitting "Stop streaming" in OBS.
Those are intentional disconnects; here we're talking about times when the client just stops sending video.
In order to handle this, live streams have a `reconnect_window`. After an unexpected disconnect,
Mux will keep the live stream "active" for the given period of time and wait for the client to reconnect and start streaming again.

When the `reconnect_window` expires, the live stream transitions back into the `idle` state. In HLS terminology,
Mux writes the `#EXT-X-ENDLIST` tag to the HLS manifest. At this point, your player can consider the live stream to have ended.
By default, `reconnect_window` is `60` (seconds) - you can set this as high as `1800` (30 minutes).

By adding a slate image, you can improve your viewers' playback experience during the reconnect window.
You can learn more about reconnect windows and slates [here](/docs/guides/handle-live-stream-disconnects#reconnect-window-and-slates).
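For example, to give broadcasters five minutes to reconnect, you could set a longer window when creating the live stream. This is a sketch of the request body; see the Create Live Stream API reference for the full set of fields:

```json
//POST https://api.mux.com/video/v1/live-streams

{
    "reconnect_window": 300,
    "playback_policies": ["public"],
    "new_asset_settings": {
        "playback_policies": ["public"]
    }
}
```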

# Lower latency Options

Mux live streams have options for "reduced" and "low" latency. The `"latency_mode": "reduced"` option gets your latency down to a range of 12-20 seconds and
the `"latency_mode": "low"` further reduces the latency to as low as 5 seconds. But your viewers might see some variance to the glass-to-glass latency because
the latency depends on many factors, including player configurations, your viewer's geographic location, and their internet connectivity speed.

## Input Requirements

You should only set `latency_mode` to `reduced` or `low` if you have control over the following:

* the encoder software
* the hardware the encoder software is running on
* the network the encoder software is connected to

Typically, residential networks and mobile connections are not stable enough to reliably use the `reduced` or `low` latency options.

## Create a live stream with the "reduced" latency option

Check out our <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream API reference</ApiRefLink> and find the `latency_mode` parameter and set the parameter to `"reduced"` in the request to create a live stream.

Set the latency mode on a live stream to `reduced` by making this POST request:

```json
//POST https://api.mux.com/video/v1/live-streams

{
    "latency_mode": "reduced",
    "reconnect_window": 60,
    "playback_policies": ["public"],
    "new_asset_settings": {
        "playback_policies": ["public"]
    }
}
```

You can read more about the reduced latency feature in this [blog post about Reduced Latency](https://mux.com/blog/reduced-latency-for-mux-live-streaming-now-available/).

## Create a live stream with the "low" latency option

<ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create Live Stream</ApiRefLink> is also used to set the `"latency_mode": "low"` flag.

Set the latency mode on a live stream to `low` by making this POST request:

```json
//POST https://api.mux.com/video/v1/live-streams

{
    "latency_mode": "low",
    "reconnect_window": 60,
    "playback_policies": ["public"],
    "new_asset_settings": {
        "playback_policies": ["public"]
    }
}
```

<Callout type="warning">
  A live stream can be configured with only one latency mode: "standard", "reduced", or "low".
</Callout>

# Low-latency FAQs

## Is low-latency HLS different from standard latency HLS?

**Yes**. The `"latency_mode": "low"` mode uses [Apple's low-latency HLS (LL-HLS) spec](https://developer.apple.com/documentation/http_live_streaming/enabling_low-latency_hls)
that allows your viewers to play live streams with as low as 5 seconds glass-to-glass latency. In comparison, the other latency modes are based on the
earlier version of the HLS spec which puts a cap on the lower limits of the glass-to-glass latency.

Mux has closely followed the new low-latency HLS spec and published about the development of the new spec a few times on our blog
[in June 2019](https://mux.com/blog/the-community-gave-us-low-latency-live-streaming-then-apple-took-it-away/)
and again [in January 2020](https://mux.com/blog/low-latency-hls-part-2/).

## Do video players support low-latency HLS?

**Yes**. Because `"latency_mode": "low"` mode uses a recent version of Apple's LL-HLS spec, you may need to upgrade your video player. Below is the list of the
most commonly used video players and the minimum version.

<Callout type="info">
  The video player may have supported LL-HLS spec in earlier versions. However, the minimum video player version mentioned below represents
  the version we used for evaluating Mux's low-latency feature.

  <br />

  <br />

  Additionally, the video player companies are continuously improving video
  playback experience as the LL-HLS spec is more widely adopted and used in the real world.
  We recommend updating your video player versions frequently whenever possible to get the latest fixes and improvements.
</Callout>

| Player | Version | Additional details |
| :-- | :-- | :-- |
| HLS.js | >= 1.1.5 | Other potentially relevant issues to track - ([#3596](https://github.com/video-dev/hls.js/issues/3596)) |
| JW Player | >= 8.20.5 | Setting the `liveSyncDuration` [configuration option](https://docs.jwplayer.com/players/reference/setup-options#behavior) can increase latency. So you should not set this option for low-latency playback. |
| THEOplayer | >= 6.0.0 | LL-HLS playback is enabled by default. |
| THEOplayer | >= 2.84.1 | Requires enabling the `LL-HLS` add-on on your player instance and setting the `lowlatency` parameter to `true` in the player configuration. |
| VideoJS | >= 8.0.0 | LL-HLS playback is enabled by default. |
| VideoJS | >= 7.16.0 | Enabling low latency playback requires initializing `videojs-https-streaming` with the `experimentalLLHLS` flag. [See FAQs](/docs/guides/reduce-live-stream-latency#how-do-you-enable-ll-hls-playback-on-videojs-player-prior-to-800).|
| [Mux Player](/docs/guides/mux-player-web) | >= 1.0 | |
| [Mux Video.js Kit](/docs/guides/playback-videojs-with-mux) | >= 0.4 | Mux Video.js Kit uses HLS.js so the same issues apply. |
| Apple iOS (AVPlayer) | 13.\* | Requires `com.apple.developer.coremedia.hls.low-latency` app entitlement for your iOS apps. Also, there are known issues that occasionally cause playback failures. |
| Apple iOS (AVPlayer) | 14.\* | There are known issues that occasionally cause playback issues. |
| Apple iOS (AVPlayer) | 15.\* | |
| Android ExoPlayer | >= 2.14 | |
| Agnoplay | >= 1.0.33 | |

## My video player does not support the LL-HLS spec. Can it still play a low-latency live stream?

**Maybe**. Apple's LL-HLS specification is backward compatible, so your video player should fall back to playing standard HLS.
Those viewers will have noticeably higher glass-to-glass latency. However, your video player does need to support demuxed audio & video tracks
(each track requested separately) in MP4 format to be backward compatible. Most video players already support demuxed audio &
video tracks in MP4 format.

## My video player is running into issues when playing a low-latency live stream. Can I play the same live stream without the low-latency?

**Yes**. You can add the `low_latency=false` parameter to the video playback URL, and Mux will deliver the same live stream
using standard HLS. However, your video player does need to support demuxed audio & video tracks
(each track requested separately) in MP4 format for the `low_latency=false` parameter to work.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?low_latency=false
```

If your `playback_id` is `signed`, then all query parameters, including `low_latency` need to be added to the claims body. Take a look at
the [signed URLs guide](/docs/guides/secure-video-playback) for details.

## How do you enable LL-HLS playback on VideoJS player prior to 8.0.0?

You can enable low latency playback using Apple's LL-HLS spec by initializing `videojs-http-streaming`
module with the `experimentalLLHLS` flag along with any other options.

```js
var player = videojs(video, {
  html5: {
    vhs: {
      experimentalLLHLS: true
    }
  }
});
```

## Can I change my existing live stream's latency to low latency?

Yes. You can change your existing live stream's latency to any of the available options
(`standard`, `reduced`, or `low`) using the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">Live Stream PATCH</ApiRefLink>.
You can only call this API when the live stream status is `idle`; this is helpful for migrating your live stream based on the streamer's
requirements. After Mux successfully applies the update, your webhook endpoint also receives a `video.live_stream.updated` event.

```json
//PATCH https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}

{
    "latency_mode": "low"
}
```


# Show live stream health stats to your streamer
Learn how to get the live stream health stats using the Live Stream Stats API.
In this guide you will learn how to use the Live Stream Health Stats<BetaTag /> API in order to embed the live stream health stats for a particular live stream ID into your applications. A common use case is when you want to show the live stream stats to your streamer during a live event, so that the streamer can monitor the status and take actions when issues occur.

<Callout type="info">
  The Live Stream Stats API is not a 1:1 mapping of what you see on the [Live Stream Health page in the Mux dashboard](/docs/guides/debug-live-stream-issues).
</Callout>

You will use JSON Web Tokens to authenticate to this API.

## 1. Understand Live Stream Stats

The Live Stream Stats API returns **Stream Drift Session Average**, **Stream Drift Deviation From Rolling Average**, and **Status**. Before we dive into each of them, understanding a couple of terms here might be helpful:

* **Wallclock time**: Also called the real-world time.
* **Stream drift**: The difference between elapsed media time and elapsed wallclock time. For example, if your encoder has been connected for 10 seconds and it has sent 5 seconds of media during that time, then your current stream drift would be 5s.

Now keep reading below for the metrics the API returns and their definitions.

### Stream Drift Session Average

**Stream Drift Session Average** is the running average of **stream drift** for the lifetime of an ingest connection. It applies a smoothing function to the potentially jagged, fluctuating raw metric. Use this metric as an indication of the average offset between the elapsed wallclock time and media time throughout the whole session.

The value we return from the API is measured in milliseconds and is continuously updated with each measurement taken. It is reset whenever the encoder disconnects.

### Stream Drift Deviation From Rolling Average

To get an indication of whether the current drift is consistent (good) or growing (bad), use **Deviation From Rolling Average**. It is the difference between the current **stream drift** and the current **stream drift rolling average**. The rolling average only takes the last ~30s of data into account, so it represents recent drift rather than measurements taken a potentially long time ago. Disparities between the current drift and the rolling average can be a good indicator because the session average moves more slowly and may not reflect the latest status.

Use this metric to understand whether the stream is experiencing issues at the moment. When it is, the **Deviation From Rolling Average** will likely be high.

### Status

The `status` returned from the Live Stream Health API could be any of the following values: `excellent`, `good`, `poor`, or `unknown`.

* `excellent`: The Stream Drift Deviation From Rolling Average is less than or equal to 500ms
* `good`: The Stream Drift Deviation From Rolling Average is less than or equal to 1s but greater than 500ms
* `poor`: The Stream Drift Deviation From Rolling Average is greater than 1s
* `unknown`: We are unable to calculate the stream drift. This is usually because the live stream is inactive and/or we have not received any data about it for a few minutes.

Use `status` as an indicator of the latest health status of the live stream ingest. A common use case is to render color coded UI for your streamer's ease-of-use based on the status information, such as green, yellow, or red. You can also check out our pre-built UI to monitor the status by going to the Mux Dashboard for the specific live stream.
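The thresholds above can be sketched as a simple mapping. Here `deviation_ms` is an illustrative placeholder for a value you would obtain from the stats API:

```shell
# Map a Stream Drift Deviation From Rolling Average (in milliseconds)
# to the status values described above.
deviation_ms=350

if [ "$deviation_ms" -le 500 ]; then
  status="excellent"
elif [ "$deviation_ms" -le 1000 ]; then
  status="good"
else
  status="poor"
fi
echo "$status"
```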

## 2. Create a Signing Key

Signing keys can be managed (created, deleted, listed) from the [Signing Keys settings](https://dashboard.mux.com/settings/signing-keys) of the Mux dashboard or via the Mux System API.

<Callout type="warning">
  When making a request to the System API to generate a signing key, the access
  token being used must have the System permission. You can confirm whether your
  access token has this permission by going to Settings > API Access Token. If
  your token doesn't have the System permission listed, you'll need to generate
  another access token with all of the permissions you need, including the
  System permission.
</Callout>

When creating a new signing key, the API will generate a 2048-bit RSA key pair and return the private key and a generated key ID; the public key will be stored at Mux to validate signed tokens. Store the private key in a secure manner.

You probably only need one signing key active at a time and can use the same signing key when requesting live stream stats for multiple live streams. However, you can create multiple signing keys to enable key rotation, creating a new key and deleting the old only after any existing signed URLs have expired.

### Example request

```bash
curl -X POST \
-H "Content-Type: application/json" \
-u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
'https://api.mux.com/system/v1/signing-keys'
```

### Example response

```json
// POST https://api.mux.com/system/v1/signing-keys
{
  "data": {
    "private_key": "(base64-encoded PEM file with private key)",
    "id": "(unique signing-key identifier)",
    "created_at": "(UNIX Epoch seconds)"
  }
}
```

<Callout type="warning">
  Be sure that the signing key's environment (Staging, Production, etc.) matches
  the environment of the live streams you would like to call for! When creating a signing
  key via API, the environment of the access token used for authentication will
  be used.
</Callout>

This can also be done manually via the UI. If you choose to create and download your signing key as a PEM file from the UI, you will need to base64 encode it before using it with (most) libraries.

```bash
❯ cat /path/to/file/my_signing_key.pem | base64
LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktL...
```

## 3. Generate a JSON Web Token

The following JWT claims are required:

| Claim Code | Description                | Value                                                                                                                                                              |
| :--------- | :------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `sub`      | Subject of the JWT         | The Live Stream ID for which stats will be returned                                                                                                                |
| `aud`      | Audience (identifier type) | `live_stream_id` (Mux Video Live Stream ID) |
| `exp`      | Expiration time            | UNIX Epoch seconds when the token expires. Use this to ensure any tokens that are distributed become invalid after a period of time.                               |
| `kid`      | Key Identifier             | Key ID returned when signing key was created                                                                                                                       |
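As a concrete sketch, the assembled claims might look like this; the IDs and timestamp are placeholders:

```json
{
  "sub": "{LIVE_STREAM_ID}",
  "aud": "live_stream_id",
  "exp": 1735689600, // UNIX Epoch seconds when the token expires
  "kid": "{SIGNING_KEY_ID}"
}
```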

<Callout type="warning">
  The Live Stream ID is generated by Mux and is available to Mux Video
  customers only. Be sure to double-check both the query ID type and value!
</Callout>

### Expiration time

Expiration time should be at least the duration of the live stream. When the signed URL expires, you will no longer be able to receive live stream stats data from the API.

<Callout type="info">
  [See the related video documentation](/docs/guides/secure-video-playback#expiration-time)
</Callout>

## 4. Signing the JWT

The steps can be summarized as:

1. Load the private key used for signing
2. Assemble the claims (`sub`, `aud`, `exp`, `kid`, etc.) in a map
3. Encode and sign the JWT using the claims map, the private key, and the RS256 algorithm

There are dozens of software libraries for creating and reading JWTs. Whether you’re writing in Go, Elixir, Ruby, or a dozen other languages, don’t fret, there’s probably a JWT library that you can rely on. For a list of open source libraries to use, check out [jwt.io](https://jwt.io/libraries).

<Callout type="warning">
  The following examples assume you're working with a private key either
  returned from the API or copied from the Dashboard, **not** one downloaded
  as a PEM file. If you've downloaded it as a PEM file, you will need to
  base64 encode the file contents first.
</Callout>

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"
    "github.com/golang-jwt/jwt/v4"
)

func main() {

    myId := ""       // Enter the id for which you would like to get counts here
    myIdType := ""   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
    keyId := ""      // Enter your signing key id here
    key := ""        // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": myId,
        "aud": myIdType,
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// using @mux/mux-node@8

import Mux from '@mux/mux-node';
const mux = new Mux();
const myId = ''; // Enter the id for which you would like to get counts here
const myIdType = ''; // Enter the type of ID provided in myId; one of video_id | asset_id | playback_id | live_stream_id
const signingKeyId = ''; // Enter your Mux signing key id here
const privateKeyBase64 = ''; // Enter your Mux base64 encoded private key here

const getViewerCountsToken = async () => {
    return await mux.jwt.signViewerCounts(myId, {
        expiration: '1 day',
        type: myIdType,
        keyId: signingKeyId,
        keySecret: privateKeyBase64,
    });
};

const sign = async () => {
    const token = await getViewerCountsToken();
    console.log(token);
};

sign();

```

```php

<?php

  // Using https://github.com/firebase/php-jwt

  use \Firebase\JWT\JWT;

  $myId = "";       // Enter the id for which you would like to get counts here
  $myIdType = "";   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
  "sub" => $myId,
  "aud" => $myIdType,
  "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
  "kid" => $keyId
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

my_id = ''              # Enter the id for which you would like to get counts here
my_id_type = ''         # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

payload = {
    'sub': my_id,
    'aud': my_id_type,
    'exp': int(time.time()) + 3600, # 1 hour
}
headers = {
    'kid': signing_key_id
}

encoded = jwt.encode(payload, private_key, algorithm="RS256", headers=headers)
print(encoded)

```

```ruby

require 'base64'
require 'jwt'
require 'openssl'

def sign_url(subject, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: subject, aud: audience, exp: expires.to_i, kid: signing_key_id}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

my_id = ''                 # Enter the id for which you would like to get counts here
my_id_type = ''            # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''        # Enter your signing key id here
private_key_base64 = ''    # Enter your base64 encoded private key here

token = sign_url(my_id, my_id_type, Time.now + 3600, signing_key_id, private_key_base64)
puts token

```



## 5. Making a Request

Supply the JWT in the resource URL using the `token` query parameter. The API will inspect and validate the JWT to make sure the request is allowed.

Example:

```bash
curl 'https://stats.mux.com/live-stream-health?token={JWT}'
```

Response:

```json
{
  "data": [
    {
      "ingest_health": {
        "updated_at": "2022-11-14T17:32:23",
        "stream_drift_session_avg": 384,
        "stream_drift_deviation_from_rolling_avg": 12,
        "status": "excellent"
      }
    }
  ]
}
```

* `stream_drift_session_avg` is the session average of stream drift. Use this to represent the overall health of the stream.
* `stream_drift_deviation_from_rolling_avg` is the delta between the current stream drift and the rolling average. Use this to represent the latest stream health.
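
As an illustrative sketch of how a client might turn these two fields into a single health signal (the threshold values here are assumptions for the example, not Mux recommendations):

```python
def summarize_ingest_health(ingest_health: dict) -> str:
    """Reduce the two stream drift metrics to a human-readable label.

    The thresholds below are illustrative assumptions, not Mux recommendations.
    """
    session_avg = ingest_health["stream_drift_session_avg"]
    deviation = ingest_health["stream_drift_deviation_from_rolling_avg"]
    if abs(deviation) >= 100:
        # The latest samples are drifting away from the session norm.
        return "degrading"
    if session_avg < 1000:
        return "healthy"
    return "unhealthy"

# Using the response fragment from the example above:
print(summarize_ingest_health({
    "stream_drift_session_avg": 384,
    "stream_drift_deviation_from_rolling_avg": 12,
}))  # healthy
```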


# Manage stream keys
Learn how to manage stream keys and enable/disable access to go live.
When live streaming, a stream key is used by a broadcaster to send a live stream to a Mux account. Stream keys are, by nature, private and should be handled with care. This means that access to the stream key should be reserved for the broadcaster and hidden from end users.

## 1. Use case - Single stream

Single live stream configurations are great for when only one stream will ever be active at a time, or for disposable, single-use live streams.

For example, if you are hosting a conference where the agenda is a back-to-back track of speakers, a single live stream covers the entire event.

## 2. Use case - Multiple streams

Multiple live stream configurations are used in situations where multiple live streams are expected. Reasons you might choose this configuration include:

* Multiple concurrent streams that overlap in when they go live
* User generated content where going live can happen at any time and there is no established schedule

## Concurrent live streams

When working with multiple streams that can overlap in real time, use multiple live stream configurations, tying each configuration to its own live stream event.

For example, if you are hosting different concurrent events, each event would need an individual live stream configuration created.

If you want to control the ability to accept a live stream, you can use the <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">enable live stream</ApiRefLink> and <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">disable live stream</ApiRefLink> API endpoints. These endpoints can be called based on your business logic from your CMS/backend to control your content creator's ability to go live.

## User generated content

If your solution allows your users to go live at any time, create a live stream configuration for each potential content creator. As you'll see below, the Mux live stream configuration `id` is tied to each content creator using your service.

When provisioning your user as a content creator, <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create a live stream</ApiRefLink> configuration that will be used solely by *this* content creator. The `data.id` response value needs to be stored within your CMS so that it can be used to deliver the live stream to end users when the content creator goes live. A live stream configuration created for a content creator can be reused by that content creator over their life span.
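
For example, the provisioning call might look like the following (authentication uses your Mux access token pair; the `passthrough` value is an illustrative way to tie the configuration back to a creator in your CMS):

```bash
curl https://api.mux.com/video/v1/live-streams \
  -X POST \
  -H "Content-Type: application/json" \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  -d '{
    "playback_policies": ["public"],
    "new_asset_settings": { "playback_policies": ["public"] },
    "passthrough": "creator-42"
  }'
```

Store `data.id` (and optionally `data.stream_key`) from the response against the creator's record.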

The `data.stream_key` value *could* also be stored in the CMS in case the content creator wants to recall the stream key at a later time.

Another option is to pass through the stream key to the content creator at provision time without storing the stream key. A common use-case that we support is for the ability to reset the stream key for a given live stream configuration. To do this, Mux offers a <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset stream key</ApiRefLink> API.

## Advanced options

### Reset stream key

If a stream key needs to be reset for a live stream configuration because it was lost or compromised, the <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset stream key</ApiRefLink> can be used to regenerate the stream key.
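
Based on the API reference linked above, resetting is a single request; the response contains the new `stream_key`:

```bash
curl "https://api.mux.com/video/v1/live-streams/${LIVE_STREAM_ID}/reset-stream-key" \
  -X POST \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}"
```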

### Complete live stream

Typically, when a content creator has ended their live stream session by stopping a stream, Mux will wait for the duration configured as the live stream's `reconnect_window` before making it available as an on-demand asset.

To make a live stream available immediately, you can <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">signal live stream complete</ApiRefLink> to immediately make the live stream available as an on-demand asset.

<Callout type="info">
  Mux does not close the encoder connection immediately. Encoders are often
  configured to re-establish connections immediately which would result in a new
  recorded asset. For this reason, Mux waits for 60s before closing the
  connection with the encoder. This 60s timeframe is meant to give encoder
  operators a chance to disconnect from their end.
</Callout>
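
Per the API reference linked above, signaling completion is a single request:

```bash
curl "https://api.mux.com/video/v1/live-streams/${LIVE_STREAM_ID}/complete" \
  -X PUT \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}"
```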

## Enable live stream

To enable a live stream configuration so that it is able to receive an RTMP session, call the <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">enable live stream</ApiRefLink> API endpoint.

<Callout type="info">
  By default, all newly created live stream configurations are enabled.
</Callout>
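
Per the API reference linked above, enabling is a single request:

```bash
curl "https://api.mux.com/video/v1/live-streams/${LIVE_STREAM_ID}/enable" \
  -X PUT \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}"
```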

## Disable live stream

Should you want to disable a live stream configuration so that it no longer accepts RTMP sessions, call the <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">disable live stream</ApiRefLink> API endpoint.

<Callout type="info">
  Unlike <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">signal live stream complete</ApiRefLink>, Mux closes the encoder connection immediately with this API. Any attempt
  from the encoder to re-establish the connection will fail until the live
  stream is re-enabled.
</Callout>
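
The disable call mirrors the enable call:

```bash
curl "https://api.mux.com/video/v1/live-streams/${LIVE_STREAM_ID}/disable" \
  -X PUT \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}"
```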


# Stream recordings of live streams
Every live stream on Mux is automatically recorded as an `asset`.
When playing back a live stream, Mux offers you two options for historical playback. Both options are available at any time; you can switch between the two at will.

1. **Non-DVR mode** - keep all users "live". Only a small portion of your live stream (approximately 30 seconds) will be exposed to your viewers through your player.
2. **DVR mode** - allow your users to scrub back to the beginning of the live stream whenever they want in your player.

The 3 Mux concepts we need to understand here are:

* <ApiRefLink href="/docs/api-reference/video/live-streams">Live streams</ApiRefLink>: This is the top level live streaming resource. Your stream key maps back to a single live stream. Live streams are reusable. Each Live stream has one or more playback IDs associated with it.
* <ApiRefLink href="/docs/api-reference/video/assets">Assets</ApiRefLink>: Assets are videos on demand. In Mux, assets get created by either: direct uploads, creating an asset with an input URL, or from recordings of live streams. Each Asset has one or more playback IDs associated with it.

{/* we have to add that darn <i></i> to disable gfm autolink literals */}

* <ApiRefLink href="/docs/api-reference/video/playback-id">Playback IDs</ApiRefLink>: Playback IDs are the resource that controls playback. A playback ID may point to either a live stream OR an asset and it can be either public or signed. More information on [signed playback IDs is here](/docs/guides/secure-video-playback). A playback ID is the identifier that you use in a `stream.mux.com` URL of the form: <pre>https://<i />stream.mux.com/{"{"}PLAYBACK\_ID{"}"}.m3u8</pre>.

## DVR mode vs. non-DVR mode

In non-DVR Mode, all users viewing the live stream will be viewing the most recent content. The player will have access to approximately the most recent 30 seconds of content.

In order to use non-DVR mode, construct your playback URL with the playback ID *associated with the live stream*.

<Callout type="success" title="Non-DVR Mode is most common">
  Most uses of live streaming opt for non-DVR mode and, if you are unsure about which to use, we recommend that you stick with non-DVR mode.
</Callout>

When using DVR mode, the player will have access to your stream's content going all the way back to the beginning of the live stream.

<Callout type="warning" title="Be careful with DVR Mode and long lived streams">
  Mux does not recommend DVR mode for live streams longer than four hours. If you expect long live streams, you should use non-DVR mode.
</Callout>

If you choose to use DVR mode, then you should construct your playback URL using the playback ID *associated with the live stream's `active_asset_id`*.
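
To make the distinction concrete, here is a small sketch in Python. The field names follow the live stream API response, except `active_asset_playback_id`, which is a hypothetical stand-in for the playback ID you would look up on the active asset via the assets API:

```python
def playback_url(live_stream: dict, dvr: bool = False) -> str:
    """Build a stream.mux.com HLS URL for a live stream.

    Non-DVR mode uses a playback ID on the live stream itself; DVR mode
    uses a playback ID belonging to the asset referenced by the live
    stream's active_asset_id (fetched separately, passed in here for
    simplicity as "active_asset_playback_id").
    """
    if dvr:
        playback_id = live_stream["active_asset_playback_id"]  # hypothetical field
    else:
        playback_id = live_stream["playback_ids"][0]["id"]
    return f"https://stream.mux.com/{playback_id}.m3u8"

live_stream = {
    "playback_ids": [{"id": "abc123", "policy": "public"}],
    "active_asset_playback_id": "def456",
}
print(playback_url(live_stream))            # https://stream.mux.com/abc123.m3u8
print(playback_url(live_stream, dvr=True))  # https://stream.mux.com/def456.m3u8
```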

## Assets created from live streams

Mux will automatically start creating an asset in the background when you begin broadcasting to your live stream. This asset has two purposes:

* You can use the asset directly in order to enable DVR mode playback.
* When the live stream is over, you can use the asset to play back the recording of the live stream.

Since assets are automatically created from every live stream and live streams can be re-used as many times as you want, Mux creates a new asset every time a live stream begins broadcasting. A single live stream can end up producing an indefinite number of assets.

The lifecycle of events produced by a Mux live stream is as follows.

| Step | Event | Description |
|------|-------|-------------|
| 1 | Initial State | Live stream begins in status `idle` |
| 2 | `video.live_stream.connected` | The encoder has connected. At this point in time the live stream will have a new `active_asset_id`. The `active_asset_id` is the ID that points to a new asset that Mux is creating for this live stream. |
| 3 | `video.asset.created` | The asset corresponding to the `active_asset_id` from step 2 gets created. This asset has a `live_stream_id` that points back to the live stream it was created from. This asset does not have any content yet, it is a placeholder that will be getting content from the ingested live stream. |
| 4 | `video.live_stream.recording` | Mux has started recording the incoming content. The live stream's status will still be `idle` at this point. |
| 5 | `video.live_stream.active` | The live stream's state has transitioned to `active`. **When in non-DVR mode**, the live stream's playback ID can now be used to build a playback URL on `stream.mux.com`. |
| 6 | `video.asset.ready` | The asset (`active_asset_id`) from step 2 and 3 will be "ready" at around the same time that the live stream is "active". This asset only has about 10 seconds worth of content at this point. The `duration` on this asset reflects the current playable duration. If you are using DVR mode, it is at this point that you can use the `active_asset_id` to build a playback URL on `stream.mux.com`. |
| 7 | `video.live_stream.disconnected` | The encoder has disconnected, and the live stream status is still `active`. Please note that live streams that do not use the `"latency_mode": "reduced"` option will enter a reconnect window (defaulting to a duration of 60 seconds) after disconnecting. The encoder can re-connect within this reconnect window and, in doing so, pick back up where it left off with the same `active_asset_id`. For more information, please consult [handling live stream disconnects](/docs/guides/handle-live-stream-disconnects). |
| 8 | `video.live_stream.idle` | After the encoder has stayed disconnected for the duration of the reconnect window, the live stream will transition back to status `idle`. This live stream will no longer have an active asset associated with it, but for ease of use this event will include the `active_asset_id` of the asset that is just ending. The next time an encoder connects, this lifecycle will start back at step 1 with a new `active_asset_id`. |
| 9 | `video.asset.live_stream_completed` | This event fires at the same time as the live stream transitions back to `idle`. This event tells you that the asset is finalized. The `duration` of the asset will now be the full, finalized duration; you can use the playback ID in your player to play the recording of the live stream. |

Please note that some of these webhook events correspond to the `live_stream` resource and others correspond to the `asset`.
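
As a sketch, a webhook handler for this lifecycle might dispatch on the event's `type` field (the return strings are purely illustrative; a real handler would update your CMS or notify viewers):

```python
def handle_mux_webhook(event: dict) -> str:
    """Route a Mux webhook payload based on its type field.

    Returns a short description of the action taken, for illustration only.
    """
    event_type = event["type"]
    data = event["data"]
    if event_type == "video.live_stream.active":
        # Non-DVR mode: the live stream's playback ID is now usable.
        return f"live: stream {data['id']} is playable"
    if event_type == "video.live_stream.idle":
        # The stream ended; the event includes the asset that is finalizing.
        return f"ended: stream {data['id']}, recording {data['active_asset_id']}"
    if event_type == "video.asset.live_stream_completed":
        # The recording is finalized and safe to play back in full.
        return f"recording ready: asset {data['id']}"
    return f"ignored: {event_type}"

print(handle_mux_webhook({
    "type": "video.live_stream.active",
    "data": {"id": "ls_123"},
}))  # live: stream ls_123 is playable
```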

More information about configuring and using webhooks can be found in the [webhooks guide](/docs/core/listen-for-webhooks).


# Stream live to 3rd party platforms
Also known as Restreaming, Live Syndication, Rebroadcasting, or RTMP Passthrough.
## 1. What does Simulcasting do?

With the Simulcasting feature, developers can enable their users to publish live streams on social platforms.

The Mux Video API makes it easy for developers to build live streaming into their applications. Combined with simulcasting, existing features like Persistent Stream Keys and Automatic Live Stream Recording provide a way to connect with a number of social sharing apps.

What Simulcasting can help you do:

* Forward a live stream on to social networks like YouTube, Facebook, and Twitch
* Let users publish user-generated live streams on social platforms
* Connect with a number of social sharing apps

<Callout type="info" title="Other names for Simulcasting">
  Other domains may use varying terminology to refer to the same general process including:

  * Restreaming
  * Live Syndication
  * Rebroadcasting
  * RTMP Passthrough
  * [Multistreams - a term used by Crowdcast](https://www.crowdcast.io/multistreams)
</Callout>

## 2. Select a Simulcast Target supported by Mux

Mux Simulcasting works with any arbitrary RTMP server. That means Mux will support Simulcast Targets from any platform that supports the RTMP or RTMPS protocol.

Targets that are supported include but are not limited to the following:

* Facebook Live
* YouTube Live
* Twitch
* Crowdcast
* Vimeo

Unfortunately the following Targets are not supported:

* Instagram (you can only go live from the Instagram app)

## 3. Add simulcasting to a Mux live stream

Use the Mux API to add simulcasting to a live stream.

The first step is to add a Simulcast Target. You can do this when the Live Stream object is first created, or anytime afterward. Note that Simulcast Targets can only be added while the Live Stream object is not active.

Here is an example of adding a Simulcast Target for each additional platform the stream should be published to:

```text
POST https://api.mux.com/video/v1/live-streams
```

```text
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ]
  },
  "simulcast_targets" : [
    {
      "url" : "rtmp://a.rtmp.youtube.com/live2",
      "stream_key" : "12345",
      "passthrough" : "YouTube Example"
    },
    {
      "url" : "rtmps://live-api-s.facebook.com:443/rtmp/",
      "stream_key" : "12345",
      "passthrough" : "Facebook Example"
    }
  ]
}
```
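
To add a target to an existing (inactive) live stream instead, use the create simulcast target endpoint:

```text
POST https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}/simulcast-targets

{
  "url": "rtmp://a.rtmp.youtube.com/live2",
  "stream_key": "12345",
  "passthrough": "YouTube Example"
}
```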

## 4. Find your RTMP Credentials on any supported platform

As defined in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-simulcast-target">Simulcast Targets API reference</ApiRefLink>, RTMP credentials consist of two parts:

* a `url` , which is the RTMP hostname including the application name for the third party live streaming service
* a `stream_key`, which is the password that represents a stream identifier for the third party live streaming service to simulcast the parent live stream to.

Note that stream keys are sensitive and should be treated with caution the same way you would with an API key or a password.

<Callout type="info" title="Specific examples, steps, and setting recommendations">
  New to live streaming? In this blog post we provide a step-by-step outline of how to get RTMP credentials from Twitch, YouTube, or Facebook.

  Not sure what settings to use?
  A good starting recommendation for your end users is 4,000 kbps at 720p resolution with 2-second keyframe intervals. The same post also provides an in-depth explanation for choosing personalized settings.

  [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/)
</Callout>

## 5. More information about simulcasting

### Pricing

Simulcasting has an added cost on top of live streaming, but like all of our pricing, you only pay for what you use.

See the [Pricing Page](https://mux.com/pricing/) for details.

### Availability

There's a limit of 6 simulcasts/restreams per live stream. Let us know if you have a use case that requires more.

### Blog Posts about Simulcasting

We have several blog posts that cover more topics about simulcasting products, if you want to read more:

* [Seeing double? Let your users simulcast (aka restream) to any social platform](https://mux.com/blog/seeing-double-let-your-users-simulcast-a-k-a-restream-to-any-social-platform/)
* [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/)


# Handle Live Stream Disconnects
In this guide we will walk through how to handle disconnects that happen during live streams.
## How Mux handles disconnects

This guide assumes you have already created and set up a live stream by following these steps:

* You have connected your encoder (for example OBS, Wirecast, or your live streaming app) to an RTMP ingest server as covered in [Configure Broadcast Software](/docs/guides/configure-broadcast-software).
* Mux sends the `video.live_stream.connected` event to your environment.
* When the encoder starts sending media to the ingest server, the webhook events `video.live_stream.recording` and then `video.live_stream.active` are delivered to your environment.

If everything goes smoothly, the encoder will keep sending media and the server will keep processing it, creating video segments and
updating the HLS playlists with new pieces of video (to understand how this
works read [Reduce live stream latency](/docs/guides/reduce-live-stream-latency)).  Since all of
this streaming is happening live, the ingest server needs to know what it should do when the encoder disconnects unexpectedly.

What happens when the live stream disconnects either intentionally or due to a drop in the network? Mux sends the `video.live_stream.disconnected`
event for the live stream to your environment. This is where the `reconnect_window` comes into play.

## Reconnect Window

When you <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create the Live Stream</ApiRefLink> or <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update the Live Stream</ApiRefLink>, you can set the `reconnect_window` parameter in the Request JSON.

The Reconnect Window is the time in seconds that Mux should wait for the live stream broadcasting software to reconnect before considering the live stream finished
and completing the recorded asset. By default, Mux sets `reconnect_window` to 60 seconds for Standard Latency streams and zero seconds for Reduced and Low Latency streams, but this can be adjusted to any value between 0 and 1800 seconds.

<Callout type="info">
  Reconnect Window is supported for all latency modes of the live stream, including "standard", "reduced" and "low".
</Callout>
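
For example, a create request body that allows up to five minutes for the encoder to reconnect might look like:

```json
{
  "playback_policies": ["public"],
  "reconnect_window": 300
}
```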

### Reconnect Window and slates

When you <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create the Live Stream</ApiRefLink> or <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update the Live Stream</ApiRefLink>,
you can set the `reconnect_slate_url` parameter with the URL of the slate image.

Slate insertion can help output a live stream for viewers without interruptions. Below are some examples where Mux receives an imperfect stream and how Mux handles the output:

* If the input contains only audio for the relevant time, the most recent video frame is duplicated
* If the input contains only video for the relevant time, Mux will output silent audio
* If a slate is inserted and the input has no audio or video (including because the encoder was disconnected) a slate period begins where Mux will output silent audio, and duplicate the most recent video frame. After 0.5 seconds, Mux will switch to the slate image and continue to send silent audio. If the encoder is still connected, Mux will disconnect the encoder after 5 minutes. Mux will then continue inserting slates for up to the duration of the `reconnect_window` in seconds. Viewers may experience a maximum slate duration of up to 5 minutes over the `reconnect_window` duration

When Mux stops receiving the media, Mux adds the slate image as a video frame to the live stream. This event of not receiving media disconnects the encoder and starts the `reconnect_window` time interval.
Mux stops adding the slate image when Mux starts receiving media again or the reconnect window time interval expires.

Enable slates for `standard`, `reduced`, and `low` latency mode live streams:

* For `standard` latency live streams, set the `use_slate_for_standard_latency` parameter to `true` and make sure the `reconnect_window` parameter value is greater than 0s. Live streams created before the slate image functionality was available will not automatically start using slates until this parameter is set.
* For `reduced` and `low` latency mode live streams, set the `reconnect_window` parameter value to greater than 0s.
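
As a sketch, an update request body that enables slates on a `standard` latency live stream could look like the following (the slate URL is a placeholder):

```json
{
  "reconnect_window": 300,
  "use_slate_for_standard_latency": true,
  "reconnect_slate_url": "https://example.com/slate.png"
}
```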

Mux selects one of the following images as the default slate image depending on the live stream's video aspect ratio. The default slate image is used unless you
set the `reconnect_slate_url` parameter. We recommend setting a slate image whose aspect ratio matches the live stream's video aspect ratio. You can modify
the `reconnect_slate_url` parameter using the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update the Live Stream</ApiRefLink> API.

<Image src="/docs/images/slate-laptop-illustration-horizontal.png" width={640} height={360} caption="Horizontal Laptop Illustration" />

<Image src="/docs/images/slate-laptop-illustration-vertical.png" width={360} height={640} caption="Vertical Laptop Illustration" sm />

Mux downloads the slate image, hosted at the URL set as the `reconnect_slate_url` parameter value, at the start of the live stream recording,
so you must ensure the image is always downloadable from that URL. When Mux cannot download the image, the default slate image (shown above) is used,
and the `video.live_stream.warning` webhook event for the live stream as well as the `video.asset.warning` webhook event for the asset is fired. Below is an example
of the webhook event body:

```json
{
    "type": "video.live_stream.warning", // or "video.asset.warning"
    "object": {
      "type": "live",
      "id": "CiinCsHA2EbsU00XwzherzjWAek3VmtUz8"
    },
    "id": "3a56ac3d-33da-4366-855b-f592d898409d",
    "environment": {
      "name": "Production",
      "id": "j0863n"
    },
    "data": {
      "warning": {
        "type": "custom_slate_unavailable",
        "message": "Unable to download custom reconnect slate image from URL 'http://example.com/bad_url.png' -- using black frames for slate if needed."
      },
      "stream_key": "5203dc64-074a-5914-0dfc-ce007f5db53a",
      "status": "idle",  // or "preparing"
      "id": "CiinCsHA2EbsU00XwzherzjWAek3VmtUz8",
      "active_asset_id": "0201p02fGKPE7MrbC269XRD7LpcHhrmbu0002"
    },
    "created_at": "2022-07-14T21:08:27.000000Z",
    "accessor_source": null,
    "accessor": null,
    "request_id": null
  }
```

<Callout type="info">
  The `status` parameter in the webhook event body (shown above) is `idle` for the live stream and `preparing` for the asset
  event to match the corresponding `status` parameter values.
</Callout>

## How to handle reconnects from the Player/Client side without a slate image

When Mux is not receiving any media, the viewer experience depends on whether Mux starts receiving media before the reconnect window expires.
We strongly recommend <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">updating the Live Stream</ApiRefLink> to add a slate image.
However, if you choose not to add a slate image, there are two possible scenarios.

### In scenario 1, the encoder reconnects

The ingest server will wait for the duration of the `reconnect_window` before it ends the live stream. While the encoder is disconnected, media is no longer being sent, so the HLS playlists are not getting new segments of video.

<Callout type="info">
  # Stalled player during live stream

  A stalled player during a live stream happens when the live stream is still active, but the HLS manifest file is not getting new video segments appended to it.

  The player will enter a `stalled` state if it runs out of buffer. To avoid this, consider adding extra buffer to your player.
</Callout>

If the encoder reconnects before the `reconnect_window` expires then the HLS playlist will resume appending new video segments to the live stream.

### In scenario 2, the encoder does not reconnect

If the encoder does not reconnect before the `reconnect_window` expires, the following events will occur:

1. Mux writes an `EXT-X-ENDLIST` tag to the HLS playlist. According to the HLS specification: *EXT-X-ENDLIST: Indicates that no more media files will be added to the playlist file*. This tells the player **this stream is over** and no more media is coming. Your player should emit an `ended` event, or something equivalent.
2. The live stream will transition from `active` back to `idle`
3. The asset that was recording (the `active_asset_id` while the live stream was active) will be finalized. If the same live stream goes live *again* at a later time, then the live stream will get a new `active_asset_id` and a new asset will be created.


# Stream simulated live
Appear to be broadcasting live, but use a pre-recorded video.
<Callout type="warning" title="Other terms for Simulated Live">
  * Pre-Recorded Live
  * Scheduled Live
  * Playout Service
  * Simulated Live from VOD
  * Pseudo-live
  * Live Linear Channel
</Callout>

You may have a pre-recorded video and want to use Mux to broadcast it as if it were live.

For now, Mux does not support Simulated Live streaming directly as a feature. As a work-around, this guide provides a few options to implement your own Simulated Live streaming solution.

Simulated Live streaming is a common strategy to ensure reliability. For example, if your platform has groups of users watching content simultaneously, you will want to employ one of the following strategies.

<Callout type="info" title="Before reading on...">
  You should be familiar with how live streaming works:

  1. Create a live stream with the Mux API
  2. Get the unique stream key for that live stream
  3. Put the server URL and stream key into an encoder (OBS, Wirecast, etc.)
  4. Use the Playback ID to view your live stream in any player that supports HLS
</Callout>
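Step 1 of that flow is a single authenticated `POST`. The sketch below only builds the request object (no network call is made); the token values are placeholders for your own Mux access token:

```javascript
// Build the request for POST https://api.mux.com/video/v1/live-streams.
// Token values are placeholders; in a real app, read them from the environment.
function buildCreateLiveStreamRequest(tokenId, tokenSecret) {
  return {
    url: "https://api.mux.com/video/v1/live-streams",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization:
        "Basic " + Buffer.from(`${tokenId}:${tokenSecret}`).toString("base64"),
    },
    body: JSON.stringify({
      playback_policy: ["public"],
      new_asset_settings: { playback_policy: ["public"] },
    }),
  };
}
// Send with fetch(req.url, req), then read `stream_key` and `playback_ids`
// from the response body for steps 2-4.
```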

## Option 1: Use a 3rd party service to send an RTMP stream

The most straightforward and reliable option we recommend is to use a third-party service built for Simulated Live streaming. Such a service lets you upload videos and send out an RTMP stream at a scheduled time.

Upload your video to the service, enter in the Mux `rtmp` ingest server details, and schedule the time you want it to "go live".

For example, [restream.io](https://restream.io/) supports streaming pre-recorded videos. Note that there is a cost associated with this option.

## Option 2: Build your own server to send an RTMP stream

The second option we recommend is to build your own server that is capable of uploading video and sending an RTMP stream to Mux.

To do this, run encoder software that can ingest a video file and send its output to a Mux RTMP ingest URL. Software you might use to build such a server includes [ffmpeg](https://ffmpeg.org/) and [GStreamer](https://gstreamer.freedesktop.org/).

If you go this "roll-your-own" route, your program should:

* Handle network blips gracefully. Even if your server is running in a reliable cloud like AWS or Google, networking between commercial data centers may experience interruptions.

* Handle disconnects. In particular, `ffmpeg` has no built-in disconnect handling, so if you use it, make sure you have a way to detect disconnects and recover from them.

* Hold up to rigorous testing. Test the program with different types of content and long running streams. Make sure what you built is reliable before you use it in production.

## Option 3: Use on-demand and simulate live in the UI

The final option is to skip the backend live streaming setup, use an on-demand video, and make it "appear live" in your UI. This is a work-around we have seen success with.

To simulate a Live Stream in the UI you could:

* Hide the timeline of the player so that users can't seek back and forth
* Have the client make requests to the server to check server-time and use the server-time to keep the playhead synced to the "current live" time
* Show a red dot that gives the impression to the user "this is live"
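The playhead-syncing idea above reduces to simple clock math. A sketch, assuming you expose the broadcast's scheduled start time and a server-time endpoint of your own:

```javascript
// Given the server's notion of "now" and the scheduled start, compute where
// the playhead should be. Both timestamps are in milliseconds.
function liveEdgeSeconds(serverNowMs, scheduledStartMs, durationSec) {
  const elapsed = (serverNowMs - scheduledStartMs) / 1000;
  if (elapsed < 0) return 0;             // broadcast hasn't "started" yet
  return Math.min(elapsed, durationSec); // clamp at the end of the video
}

// In the client, periodically re-seek if the viewer has drifted too far.
function shouldReseek(playerTimeSec, targetSec, toleranceSec = 2) {
  return Math.abs(playerTimeSec - targetSec) > toleranceSec;
}
```

The tolerance avoids constant seeking from normal clock jitter while keeping every viewer within a couple of seconds of the "live" point.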

## Provide your feedback

We'd love to hear what is and isn't working, so if you are using one of these solutions (or some other solution), please send us your ideas.

If you are interested in Simulated Live streaming as a Mux feature, let us know about your use case and specific needs!


# Debug live stream issues
Learn how to debug live streams and identify the most commonly seen live stream issues with the Live Stream Input Health dashboard.
## Navigate to the Live Stream Input Health dashboard

The Live Stream Input Health dashboard is a real-time dashboard that provides visibility into how Mux is receiving your live stream from the encoder. When a sizable percentage of your viewers complain about their viewing experience, or a configured Mux Data Alert fires, a good starting point for identifying the problem is the live stream's input health. The video below shows how to navigate to your Live Stream Input Health dashboard.

<Image src="/docs/images/navigate-to-live-stream-input-health.gif" width={1666} height={1088} alt="Navigate to live stream input health" />

## Healthy live stream

Let's first look at a healthy live stream in the dashboard.

<Image src="/docs/images/health-live-stream.png" width={2322} height={724} alt="Healthy live stream" />

A few key points to notice from this graph that indicate this is a healthy live stream:

* Mux is receiving consistent frames per second. Receiving inconsistent frames per second can introduce video stuttering and sometimes cause playback interruptions for all your viewers.
* A consistent, non-zero audio bitrate is important for uninterrupted listening. A good encoder sends a constant non-zero audio bitrate even when no one is speaking and no music is playing. A varying audio bitrate can result in a bad listening experience and is sometimes an indicator of audio-video sync problems.
* Like audio, a consistent average video bitrate is equally important for a good viewing experience. A varying video bitrate does not necessarily cause a playback problem but can result in a bad viewing experience.
  * A low variance in the video bitrate typically means optimal network bandwidth availability and encoder hardware resource utilization.
  * A high variance in the video bitrate indicates that the encoder hardware cannot keep up with the encoding load. Try reducing the video bitrate and using a constant bitrate (CBR) for a more reliable live stream input. Alternatively, you can also switch to another encoder like [OBS, Wirecast, etc](/docs/guides/configure-broadcast-software#software-encoders).
  * An unstable/unreliable network bandwidth availability results in transient video bitrate drops, which can cause playback interruptions.

<Callout type="success">
  No actions required.
</Callout>
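If you want to flag the variance patterns described above programmatically, one option is the coefficient of variation (standard deviation divided by mean) over a window of recent bitrate samples. The 0.25 cutoff below is illustrative, not a Mux-defined threshold:

```javascript
// Classify a window of video bitrate samples (kbps) by their coefficient
// of variation (stddev / mean). The 0.25 threshold is illustrative.
function bitrateVariance(samples) {
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  const variance =
    samples.reduce((a, b) => a + (b - mean) ** 2, 0) / samples.length;
  const cv = Math.sqrt(variance) / mean;
  return { mean, cv, healthy: cv < 0.25 };
}
```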

## Unhealthy live stream

Now let's look at a few examples of live stream issues and potential next steps for resolution.

## Example 1: High video bitrate variance

<Image src="/docs/images/unhealthy-live-stream-1.png" width={2322} height={722} alt="Unhealthy live stream high video bitrate variance" />

Because of the constant frames per second and audio bitrate, this live stream looks good at first glance, but the high variance in video bitrate and the drop in average video bitrate mid-stream can impact the viewer experience.

<Callout type="warning">
  # Use lower and constant video bitrate

  Configure your encoder to use a lower video bitrate and a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Example 2: Intermittent loss

<Image src="/docs/images/unhealthy-live-stream-2.png" width={2322} height={722} alt="Unhealthy live stream intermittent loss" />

Mux is receiving mostly constant frames per second and audio/video bitrate, which indicates that the stream is healthy while the encoder is connected. However, the small spikes and the intermittent loss of the live stream indicate transient network bandwidth issues.

<Callout type="error">
  Try switching to a more reliable network and/or stop other network bandwidth consuming services for the duration of the live stream.
</Callout>

## Example 3: Spiky audio and video bitrate

<Image src="/docs/images/unhealthy-live-stream-3.png" width={2340} height={718} alt="Unhealthy live stream spiky audio and video bitrate" />

There is high variance in the audio and video bitrate in this example. Because the connection never fully drops, the network is probably not the problem here. More likely, the encoder is unable to keep up at a fast enough pace to send consistent video and audio data. One cause of this is that the device running the encoder may be running out of available CPU.

<Callout type="error">
  Consider using any of these [recommended encoders](/docs/guides/configure-broadcast-software#software-encoders) for your live stream.
</Callout>

<Callout type="warning" title="Use lower and constant video bitrate">
  Configure your encoder to use a lower video bitrate and a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Example 4: Spiky frame rate

<Image src="/docs/images/unhealthy-live-stream-4.png" width={2296} height={728} alt="Unhealthy live stream spiky frame rate" />

This is a good example of a very unhealthy live stream. There is high variance in the video bitrate and several instances of the frame rate dipping to nearly zero. The spiky video bitrate mid-stream indicates that the encoder is optimizing the encoding based on the feed contents, which is not ideal for live streaming.

<Callout type="error">
  Try switching to a more reliable network and/or stop other network bandwidth consuming services for the duration of the live stream.
</Callout>

<Callout type="error" title="Use constant video bitrate">
  Configure your encoder to use a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Integrate Live Stream Input Health data

<Callout type="info">
  Please note, this feature is only available to customers who have subscribed to it. [Contact our Sales team](https://www.mux.com/sales-contact) if you would like more information.
</Callout>

Live Stream Input Health data can be integrated with an Amazon Kinesis or Google Pub/Sub endpoint in your cloud account. Health and encoding metadata are sent to Kinesis or Pub/Sub as the events occur and can be retrieved from the stream at the same five-second interval as the Dashboard.

Each message is either a Live Stream Input Health update or a metadata update from the encoder. The data can be stored in your long-term storage for immediate display and historical reporting.

This method of access is most useful for customers who want to embed live stream health in a user-facing application feature or need to build an internal operational tool for stream reporting.

## Setting up a streaming export

Streaming exports can be configured in the **Streaming Exports** settings in your Mux dashboard. See the setup guide for your platform for more information on setting up an export:

* [Amazon Kinesis Data Streams](/docs/guides/export-amazon-kinesis-data-streams)
* [Google Cloud Pub/Sub](/docs/guides/export-google-cloud-pubsub)

## Message Format

Messages are formatted using Protobuf (proto2) encoding. Every message uses the `live_stream_input_health.v1.LiveStreamInputHealth` message type defined in the export Protobuf spec.

The protobuf definition for the Live Stream Input Health is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/live_stream_input_health/v1). Please subscribe to this repository for updates to the protobuf definition.

There are two types of updates that can be specified, though new types may be added in the future. Each message contains one type of update:

* Encoder metadata sent by the RTMP encoder
* Stream Input Health data

The following are descriptions of the data provided by each type of update:

```javascript
RTMPMetadataEvent = {
  // Unless otherwise specified, all the data contained in `video_track` and `audio_track` is as
  // specified by the encoder (not as observed).
  "video_track": {                           // Video track, present for AV streams
      "width": 1280,                         // Width of the input video
      "data_rate": 4000,                     // Kbps data rate of the video
      "codec_id": "avc1",                    // Video codec
      "height": 720,                         // Height of the input video
      "frame_rate": 30                       // Number of frames per second
  },
  "audio_track": {                           // Audio track, present for AV and audio-only streams
      "sample_size": 16,                     // Bits per audio sample
      "sample_rate": 44100,                  // Sample rate
      "data_rate": 128,                      // Kbps data rate of the audio
      "codec_id": "mp4a",                    // Audio codec
      "channel_count": 1                     // Number of audio channels
  },
  "encoder": "ffmpeg",                       // The encoder used to transcode for the broadcast

  "live_stream_id": "uiwe7gZtIcuyYSCfjfpGjad02RPqN", // The Mux Live Stream Id for live stream
  "asset_id": "hfye6sBqRmR8MRJZaWYq602X1rB0"         // The Mux Asset Id for the asset where the input stream is stored
}
```

```javascript
HealthUpdateEvent = {
  "video_tracks": [                        // Video tracks, present for AV streams
    {
      "bytes_received": 3155737,           // Number of video bytes received data during this interval
      "stream_start_ms": 4979091,          // Timestamp of the first video frame in this interval, as measured in milliseconds since start of the stream
      "stream_end_ms": 4985097,            // Timestamp of the last video frame in this interval, as measured in milliseconds since start of the stream
      "keyframes_received": 3,             // Number of keyframes that occurred during this interval
      "total_frames_received": 180         // Total number of video frames received during this interval
    }
  ],
  "audio_tracks": [                        // Audio tracks, present for AV and audio-only streams
    {
      "bytes_received": 94864              // Number of audio bytes received from the encoder during this interval
    }
  ],
  "caption_tracks": [                      // Caption tracks
    {
      "bytes_received": 12354,             // Number of captions bytes received from the encoder during this interval
      "channel_count": 1                   // Number of captions channels that received data during this interval
    }
  ],

  "measurement_start_ms": 1644313838000,       // Timestamp of the start of the interval in milliseconds since Unix epoch
  "measurement_end_ms": 1644313838000,         // Timestamp of the end of the interval in milliseconds since Unix epoch

  "live_stream_id": "uiwe7gZtIcuyYSCfjfpGjad02RPqN",   // The Mux Live Stream Id for live stream
  "asset_id": "hfye6sBqRmR8MRJZaWYq602X1rB0",           // The Mux Asset Id for the asset where the input stream is stored
  "asn": 25135,                            // The ASN number for the ingest IP address
  "asn_name": "VODAFONE_UK_ASN (AS2135)"   // The friendly name associated with the ASN number
}
```
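The per-interval counters in a `HealthUpdateEvent` are enough to reconstruct the rates the dashboard plots. A sketch of the arithmetic, using the field names from the example above (the helper and rounding are ours):

```javascript
// Derive the observed frame rate and video bitrate from one HealthUpdateEvent.
function observedRates(event) {
  const v = event.video_tracks[0];
  // Duration of the interval, per the video track's stream timestamps.
  const intervalSec = (v.stream_end_ms - v.stream_start_ms) / 1000;
  return {
    fps: v.total_frames_received / intervalSec,
    videoKbps: (v.bytes_received * 8) / 1000 / intervalSec,
  };
}
```

Applied to the sample event above (180 frames and 3,155,737 bytes over ~6 seconds), this works out to roughly 30 fps at about 4,200 kbps.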

### Update Frequency

* Encoder metadata is sent when the RTMP stream connects to Mux. Some encoders also send metadata updates during the live stream.
* Live Stream Input Health updates occur every 5 seconds for each stream that is currently connected.


# Add your own live closed captions
Learn how to add your own closed captions to your live stream for accessibility.
## Why are closed captions important?

Closed captions refer to the visual display of the audio in a program. Closed captions make video more accessible to people who are deaf or hard of hearing, but the benefits go beyond accessibility. Closed captions empower your viewers to consume video content in whichever way is best for them, whether it be audio, text, or a combination.

## Supported live caption formats

There are many types of closed caption sources, and each streaming standard may use a different format for embedding captions in the output. Mux supports receiving closed captions embedded in the H.264 video stream using the CEA-608 standard, for a single language.

CEA-608 stems from the analog era where closed captions data was carried directly in the transmission in a line of the video content that wasn’t displayed unless the decoder was told to look for it. These were often referred to as “Line 21” captions. CEA-608 is still the primary standard for transmitting closed captions within the same stream as audio/video content.

Most major live caption providers (e.g. AI-Media, EEG Falcon, 3Play, Verbit) will support the CEA-608 standard. Mux will translate the CEA-608 captions into WebVTT that will be delivered as part of the HLS stream/manifest, in a standard HLS-supported manner. We will continue to evaluate demand for supporting captions for multiple languages and other caption formats.

## Integrate your own closed captions

Add the `embedded_subtitles` array at time of stream creation or to an existing live stream. Closed captions are a type of subtitle. The resulting Asset's subtitle text track will have `closed_captions: true` set.

| Input Parameters | Type   | Description                                                                                                                                                                                    |
| ---------------- | ------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| name             | string | The name of the track containing a human-readable description. This value must be unique across all the text type and subtitles text type tracks. Defaults to `language_code` if not provided. |
| passthrough      | string | Arbitrary metadata set for the live closed caption track. Max 255 characters.                                                                                                                  |
| language\_code    | string | The language of the closed caption stream. Value must be BCP 47 compliant. Defaults to `en` if not provided                                                                                    |
| language\_channel | string | CEA-608 caption channel to read caption data from. Possible values: "cc1"                                                                                                                      |

### Step 1A: Create a live stream in Mux

Create a live stream using the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream Creation API</ApiRefLink>. Let Mux know that closed captions will be embedded in the RTMP stream at time of live stream creation.

#### API Request

```json
POST /video/v1/live-streams

Request Body
{
  "playback_policy" : [
    "public"
  ],
  "embedded_subtitles" : [
    {
      "name": "English CC",
      "passthrough": "English closed captions",
      "language_code": "en-US",
      "language_channel" : "cc1"
    }
  ],
  "new_asset_settings" : {
    "playback_policy" : [
      "public"
    ]
  }
}
```

#### API Response

```json
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": ["public"]
    },
    "embedded_subtitles": [
      {
        "name": "English CC",
        "passthrough": "English closed captions",
        "language_code": "en-US",
        "language_channel": "cc1"
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}
```

### Step 1B: Configure live closed captions for an existing live stream

Use the Live Stream Closed Captions API to configure closed captions for an existing live stream. Live closed captions cannot be configured on an active live stream.

#### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/embedded-subtitles

Request Body
{
  "embedded_subtitles": [
    {
      "name": "en-US",
      "language_code": "en-US",
      "language_channel": "cc1"
    }
  ]
}
```

#### API Response

```json
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "embedded_subtitles" : [
      {
        "name": "English",
        "language_code": "en-US",
        "language_channel": "cc1"
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}
```

### Step 2: Create an event with your preferred closed caption vendor

Log into your preferred closed caption provider account (e.g. AI-Media, 3Play, Verbit) and create an event that needs to be captioned. To create an event, you will need to provide the following inputs:

* Start date and time
* Language of audio to be captioned
* Destination Stream URL and Stream Key (Mux). The caption vendor will send video with captions encoded via the 608 standard to this destination.

| RTMP Server URL                     | Description                                                                                                          | Common Applications                                                                                                         |
| ----------------------------------- | -------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------- |
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP ingest URL. Compatible with the majority of streaming applications and services.                | Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/). |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS ingest URL. Compatible with fewer streaming applications, but offers a higher level of security. | OBS, Wirecast, Streamaxia RTMP SDKs                                                                                         |

Mux's global RTMP and RTMPS ingest URLs connect you to the closest ingest region. While these ingest URLs typically provide optimal performance, you can also select a specific region using our [regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls).

Upon successful event creation, the closed caption provider will provide the following:

* Stream URL
* Stream Key

Learn more about:

* [How to set up live captions with AI-Media EEG Falcon](https://www.ai-media.tv/wp-content/uploads/Manual_Falcon-User-Guide-.pdf)
* [How to set up live captions with 3Play Media](https://support.3playmedia.com/hc/en-us/articles/360048839533-Live-Captions-Schedule-Live-Captions-for-an-RTMP-Stream)
* [How to set up live captions with Verbit](https://verbit-ai.zendesk.com/hc/en-us/articles/4403013880594-Verbit-s-RTMP-Solution-for-Livestreaming-Events)

### Step 3: Point your RTMP stream to your caption provider

Configure your video encoder with the Stream URL and Stream Key provided by the closed caption provider in Step 2.

### Step 4: Start your live stream

When the stream goes live, a new live asset is created and tracks will be created for the corresponding captions.

### Step 5: Monitor closed caption stream health

When your stream is live, visit the Live Stream Input Health dashboard to monitor closed caption stream health. The dashboard shows whether Mux is receiving closed captions. More details can be found at [Debug live stream issues](/docs/guides/debug-live-stream-issues).

## Update stream to not expect live captioning for future connections

Let Mux know not to expect closed captions when the live stream starts again by configuring your live stream with an empty captions list. This request can only be made while the live stream is idle.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/embedded-subtitles

Request Body
{
  "embedded_subtitles" : []
}
```

## FAQ

### Are there any language restrictions?

Yes. The 608 standard only supports the following languages: English, Spanish, French, German, Dutch, Portuguese, and Italian. We currently support live closed captions for a single language only. We will evaluate supporting multiple languages based on customer feedback.

### Is the 608 standard supported by my closed caption vendor?

Caption vendors known to support 608 captions: 3Play, AI-Media EEG Falcon, Verbit

Caption vendors known to not support 608 captions: Rev.ai

### When can I edit my live closed caption configuration?

You can only edit your live caption configuration while the live stream is idle; you cannot make any changes while the live stream is active.

### Will formatting be preserved?

Mux will translate the CEA-608 captions into WebVTT. Though Mux attempts to preserve the caption formatting, some formatting may be lost.

### Do live captions work with audio-only streams?

No. If you have a use case for this, please let us know.

### How do I download my closed caption track?

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

More details can be found at [Advanced Playback features](/docs/guides/play-your-videos#advanced-playback-features).
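For example, the URL can be assembled from a playback ID and track ID like so (the IDs below are placeholders):

```javascript
// Build the sidecar WebVTT URL for a closed caption track.
function captionVttUrl(playbackId, trackId) {
  return `https://stream.mux.com/${playbackId}/text/${trackId}.vtt`;
}
```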

### Do live closed captions work with low latency?

Not at this time. If you have a use case for this, please let us know.


# Add Auto-Generated Live Captions
In this guide you will learn how to add auto-generated live captions to your Mux live stream.
## Overview

Mux is excited to offer auto-generated live closed captions in English, French, German, Italian, Portuguese, and Spanish. Closed captions make video more accessible to people who are deaf or hard of hearing, but the benefits go beyond accessibility. Captions empower your viewers to consume video content in whichever way is best for them, whether it be audio, text, or a combination.

For auto-generated live closed captions, Mux uses AI-based speech-to-text technology to generate the captions. Closed captions refer to the visual display of the audio in a program.

## 1. Is my content suitable for auto-generated live closed captions?

Non-technical content with clear audio and minimal background noise is most suitable for auto-generated live captions. Content with music, or with multiple speakers talking over each other, is not a good use case for auto-generated live captions.

<Callout type="info">Accuracy ranges for auto-generated live captions range from 70-95%.</Callout>

## 2. Increase accuracy of captions with transcription vocabulary

For all content, we recommend you provide a transcription vocabulary of technical terms (e.g. CODEC) and proper nouns. By providing the transcription vocabulary beforehand, you can **increase the accuracy** of the closed captions.

The transcription vocabulary helps the speech-to-text engine transcribe terms that otherwise may not be part of its general library. Your use case may involve brand names or proper nouns that are not normally part of a language model's library (e.g. "Mux"). Or perhaps you have a term, say "Orchid", which is the brand name of a toy: the engine will recognize "orchid" as a flower, but you would want the word transcribed with proper capitalization as a brand.

Please note that it can take up to 20 seconds for the transcription vocabulary to be applied to your live stream.

## 3. Create a new transcription vocabulary

You can create a new transcription vocabulary by making a `POST` request to the `/video/v1/transcription-vocabularies` endpoint with the input parameters below. Each transcription vocabulary can have up to 1,000 phrases.

### Request Body Parameters

| Input parameters | Type     | Description                                                                                                                                           |
| ---------------- | -------- | ----------------------------------------------------------------------------------------------------------------------------------------------------- |
| name             | `string` | The human readable description of the transcription vocabulary.                                                                                          |
| phrases          | `array`  | An array of phrases to populate the transcription vocabulary. A phrase can be one word or multiple words, usually describing a single object or concept. |

### API Request

```json
POST /video/v1/transcription-vocabularies
{
  "name": "TMI vocabulary",
  "phrases": ["Mux", "Demuxed", "The Mux Informational", "video.js", "codec", "rickroll"]
}
```

### API Response

```json
{
  "data": {
    "updated_at": "1656630612",
    "phrases": ["Mux", "Demuxed", "The Mux Informational", "video.js", "codec", "rickroll"],
    "name": "TMI vocabulary",
    "id": "4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR00cLKXFlc",
    "created_at": "1656630612"
  }
}
```
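Because each vocabulary is capped at 1,000 phrases, it can be useful to split a longer phrase list into multiple request bodies before posting. The splitting and naming strategy below is our own convention, not a Mux behavior:

```javascript
// Split a long phrase list into request bodies of at most 1,000 phrases each.
const MAX_PHRASES = 1000;

function vocabularyBodies(name, phrases) {
  const bodies = [];
  for (let i = 0; i < phrases.length; i += MAX_PHRASES) {
    bodies.push({
      // Suffix a counter onto all chunks after the first, e.g. "Name (2)".
      name: bodies.length === 0 ? name : `${name} (${bodies.length + 1})`,
      phrases: phrases.slice(i, i + MAX_PHRASES),
    });
  }
  return bodies;
}
```

Each body can then be sent as its own `POST /video/v1/transcription-vocabularies` request, and the resulting IDs passed together in `transcription_vocabulary_ids` (keeping in mind the 1,000-unique-phrase cap across vocabularies noted below).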

## 4. Enable auto-generated live closed captions

Add the `generated_subtitles` array at time of stream creation or to an existing live stream.

### Request Body Parameters

| Input parameters               | Type     | Description                                                                                                                                                                                                                 |
| ------------------------------ | -------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `name`                         | `string` | The human readable description for the generated subtitle track. This value must be unique across all the text type and subtitles text type tracks. If not provided, the name is generated from the chosen `language_code`. |
| `passthrough`                  | `string` | Arbitrary metadata set for the generated subtitle track.                                                                                                                                                                    |
| `language_code`                | `string` | BCP-47 language tag for captions. Defaults to `"en"`.                                                                       |
| `transcription_vocabulary_ids` | `array`  | The IDs of existing Transcription Vocabularies that you want to be applied to the live stream. If the vocabularies together contain more than 1,000 unique phrases, only the first 1,000 will be used.                      |

Mux supports the following languages and corresponding language codes for live generated captions.

| Language | Language Code |
| :-- | :-- |
| English | `"en"` |
| Spanish | `"es"` |
| Italian | `"it"` |
| Portuguese | `"pt"` |
| German | `"de"` |
| French | `"fr"` |

<Callout type="info">Locale codes such as `"en-US"` or `"es-MX"` are accepted and will be parsed down to their language code (e.g., `"en-US"` → `"en"`).</Callout>
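If your application accepts arbitrary locale codes, you can normalize them to the supported set before creating the stream. A minimal sketch; returning `null` for unsupported languages is our own convention:

```javascript
// Languages Mux supports for auto-generated live captions.
const SUPPORTED = new Set(["en", "es", "it", "pt", "de", "fr"]);

// "en-US" -> "en"; returns null for languages that cannot be auto-captioned.
function normalizeCaptionLanguage(locale) {
  const lang = locale.toLowerCase().split("-")[0];
  return SUPPORTED.has(lang) ? lang : null;
}
```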

### Step 1A: Create a live stream in Mux

Create a live stream using the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream Creation API</ApiRefLink>. Let Mux know that you want auto-generated live closed captions.

#### API Request

```json
POST /video/v1/live-streams

Request Body
{
  "playback_policy" : ["public"],
  "generated_subtitles": [
    {
      "name": "English CC (auto)",
      "passthrough": "English closed captions (auto-generated)",
      "language_code": "en",
      "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
    }
  ],
  "new_asset_settings" : {
    "playback_policy" : ["public"]
  }
}
```

#### API Response

```json
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "generated_subtitles": [
      {
        "name": "English CC (auto)",
        "passthrough": "English closed captions (auto-generated)",
        "language_code": "en",
        "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}
```

# Step 1B: Configure live captions for an existing live stream

Use the Generated Subtitles API to configure auto-generated closed captions for an existing live stream. Live closed captions cannot be configured while the live stream is active.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/generated-subtitles

Request Body
{
  "generated_subtitles": [
    {
      "name": "English CC (auto)",
      "passthrough": "{\"description\": \"English closed captions (auto-generated)\"}",
      "language_code": "en",
      "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
    }
  ]
}
```

### API Response

```json
Response
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "generated_subtitles": [
      {
        "name": "English CC (auto)",
        "passthrough": "{\"description\": \"English closed captions (auto-generated)\"}",
        "language_code": "en",
        "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
      }
    ]
  }
}
```

# Step 2: Start your live stream

* When the live stream starts, two text tracks are created for the active asset, with `text_source` attributes of `generated_live` and `generated_live_final`, respectively.

* While the stream is live, the `generated_live` track will be available and include predicted text for the audio.

* At the end of the stream, the `generated_live_final` track will transition from the `preparing` to the `ready` state; this track includes finalized text predictions, resulting in higher-accuracy, better-timed text.

* After the live event has concluded, the playback experience of the asset created will only include the more accurate `generated_live_final` track, but the sidecar VTT files for both tracks will continue to exist.
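
For example, once the asset's track list is available (e.g. fetched via the Asset API), you could separate the two generated caption tracks by their `text_source` attribute. This is an illustrative sketch; the track dicts below show only the fields it relies on:

```python
def split_caption_tracks(tracks):
    """Separate generated live caption tracks by their text_source."""
    live = [t for t in tracks if t.get("text_source") == "generated_live"]
    final = [t for t in tracks if t.get("text_source") == "generated_live_final"]
    return live, final
```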

# Step 3: Stop auto-generating closed captions for future connections

To prevent future connections to your live stream from receiving auto-generated closed captions, update the `generated_subtitles` configuration to `null` or an empty array.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/generated-subtitles

Request Body
{
  "generated_subtitles" : []
}
```

# Step 4: Manage and update your transcription vocabulary

### Update phrases in a transcription vocabulary

Phrases can be updated at any time, but changes do not take effect for active live streams that already have auto-generated closed captions enabled with that transcription vocabulary applied. Updates made while a stream is active are applied the next time the stream starts.

### API Request

```json
PUT /video/v1/transcription-vocabularies/$ID
{
  "phrases": ["Demuxed", "HLS.js"]
}
```

## FAQs

### What happens if my live stream has participants speaking languages other than the caption stream I've chosen?

If you've added an auto-generated English caption track and your audio contains a non-English language, we will still attempt to generate English captions for all of the content. For example, if both French and English are spoken, the French content will be transcribed with the English model and the output will be incomprehensible.

### When can I edit my live caption configuration?

Only when the live stream is idle. You cannot make any changes while the live stream is active.

### How do I download my auto-generated closed caption track?

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

More details can be found in [Advanced Playback features](/docs/guides/play-your-videos#advanced-playback-features).
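
Building that URL in code is straightforward (the playback and track IDs below are placeholders):

```python
def vtt_url(playback_id, track_id):
    # Sidecar VTT files for a text track live under the playback ID
    return f"https://stream.mux.com/{playback_id}/text/{track_id}.vtt"
```

Fetch the resulting URL with any HTTP client; if your playback policy is signed, you'll also need to append a token as described in the Signing JWTs guide.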

### Do live captions work with low latency live streams?

Not at this time.


# Live Streaming FAQs
Answers to common questions relating to Live Streaming.
## What is Mux’s latency for live streams?

For a standard live stream, latency is expected to be greater than 20 seconds and typically about 25 - 30 seconds. We offer a
[reduced latency](https://mux.com/blog/reduced-latency-for-mux-live-streaming-now-available/) mode which will reduce latency to about 12 - 20 seconds.
A low-latency live stream can go as low as 5 seconds of glass-to-glass latency but the latency can vary depending on your viewer's geographical
location and internet connectivity.

## Does Mux support WebRTC for live streaming ingest?

We currently do not support direct WebRTC ingest to a Live Stream.

We have focused on RTMP and RTMPS for live streams from an encoder as they are the most universal ingest protocols.

## How can I go live from a browser?

To [go live directly from a browser](https://mux.com/blog/the-state-of-going-live-from-a-browser/), you need to convert the browser stream into a format that can be consumed by RTMP on our end.

For example, we’ve had customers use Zoom to provide the video from a browser, and use its RTMP-out feature to broadcast a stream with Mux for streaming to a larger [conference-like audience](https://mux.com/blog/how-to-host-your-own-online-conference/).

## Is Mux Live Streaming API suitable for 2-way video communication applications?

Mux's Live Streaming API is not intended to provide 2-way video communication. For use cases that require 2-way video communication, we'd suggest looking at one of our partners, [LiveKit](https://livekit.io/).

## Is it possible to rewind live content while the live stream continues?

**DVR (Digital Video Recorder) mode** is a live stream feature that lets viewers rewind while the stream continues. Mux supports both DVR and non-DVR modes for live streams.

**Non-DVR mode** is enabled by default; viewers only have access to the most recent 30 seconds of the live stream.

**DVR mode** works by utilizing the live stream's `active_asset_id`. When constructing the playback URL, use the `playback_id` associated with the `active_asset_id`. When the live video ends, the <ApiRefLink href="/docs/api-reference/video/playback-id">Playback ID</ApiRefLink> associated with the `active_asset_id` will automatically transition to an on-demand asset for playback instead.

For more information and caveats behind these two modes, refer to our [Stream recordings of live streams](/docs/guides/stream-recordings-of-live-streams) guide.

## What is the maximum live stream duration?

Currently, we have a 12-hour limit for continuous streaming to our live endpoints. The live stream is disconnected after 12 hours.

If the encoder reconnects, Mux will transition to a new asset with its own playback ID.

## Do I need to create stream keys for every live event?

No, stream keys can be re-used as many times as you want. It's common for applications to assign one stream key to each user (broadcaster) in their system and allow that user to re-use the same stream key over time.

## Is there a limit to creating stream keys and live streams?

There is no limit on how many stream keys and live streams you can create.

Once created, stream keys are persistent and can be used for any number of live events.

## Do you charge for creating stream keys?

We don’t charge for creating stream keys, only when sending us an active RTMP feed.

## Can I live stream a pre-recorded video?

Mux does not support generating simulated live streams from on-demand assets, a capability sometimes called a "playout service".

However, you can run a simulated live stream using a tool like [OBS](https://obsproject.com/) or Wirecast to send your on-demand asset to us as an RTMP stream. See how to configure your RTMP encoder on our [Configuring Broadcast Software docs page](/docs/guides/configure-broadcast-software).

For a more comprehensive guide and the work-around options we commonly recommend, see [Stream simulated live](/docs/guides/stream-simulated-live).

## Can I restream/simulcast my live stream to social platforms like Facebook?

Yes. Mux Video's live service supports up to six simultaneous restreams to third-party platforms that accept an RTMP feed.

Read more in this blog post: [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/).

## Is my content saved after the live broadcast is over?

Yes. Mux will automatically create an <ApiRefLink href="/docs/api-reference/video/assets">on-demand (VOD) asset</ApiRefLink> after your live stream ends, and it can be streamed again instantly.

## Can I get access to my live event's recording?

Yes, you can enable downloading of the entire event recording using the [Master access](/docs/guides/download-for-offline-editing) feature.

With Master access enabled, you will receive a Webhook notification after the live stream ends, indicating that the master copy of the video asset is available to download.

## Can I generate thumbnails/GIFs while the live stream is active?

Yes. You can use our [thumbnail and animated GIF API](/docs/guides/get-images-from-a-video) while the live event is active.

Many customers use thumbnails or GIFs to show what content is currently playing or as a way to promote the live stream.

## Can I test Mux live streaming for free?

On any paid plan, you can create [free test live streams](https://mux.com/blog/new-test-mux-video-features-for-free/) to help evaluate the Mux Video APIs without incurring any cost.

We give you access to create an unlimited number of test live streams. Test live streams are watermarked with the Mux logo, limited to 5 minutes, and disabled after 24 hours.

## Can I add multiple audio channels or tracks to my live stream?

No, we currently support only one audio track for live streams. On-demand video assets do support [multiple alternative audio tracks](/docs/guides/add-alternate-audio-tracks-to-your-videos).

You may want your users to be able to select a language in the player and hear different audio for the same video content. One workaround is to ingest multiple streams, one per language, and then add logic to the player to switch between the different playback URLs when the user changes the language.

## What happens if I live stream variable frame rate (VFR) content?

While Mux does not output variable frame rate (VFR) content for live streams, we will accept VFR content for ingest. That said, we recommend using constant frame rate (CFR) content for live streams to ensure the best playback experience.


# Filter your Data
Learn how to use the `filters[]` parameter to filter your data with flexible syntax for different dimension types.
The `filters[]` parameter allows you to filter your data using flexible syntax that supports different types of operations depending on the dimension type.

## Filter Syntax Overview

The basic format for all filters is:

<code>
  filters\[]=<span className="bg-yellow text-white">\<operation></span><span className="bg-orange text-white">\<dimension></span>:<span className="bg-pink text-white">\<value></span>
</code>

* <span className="bg-yellow text-white font-mono">\<operation></span> is the optional prefix that defines the type of filter.
  Examples:
  * (none) → equals → `country:US`
  * `!` → not equals → `!country:US`
  * `+` → set contains → `+video_cdn_trace:fastly`
  * `-` → set omits → `-video_cdn_trace:cloudflare`

* <span className="bg-orange text-white font-mono">\<dimension></span> is the field or metric you want to filter on.
  Examples: `country`, `operating_system`, `video_cdn_trace`

* **`:`** acts as the separator between the dimension and the value.

* <span className="bg-pink text-white font-mono">\<value></span> is the value you're comparing against.
  Examples:
  * Scalar → `US`, `windows`
  * Trace → `[fastly,akamai]`
  * Empty trace → `[]`
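
Putting those pieces together, a small helper (illustrative only, not part of any Mux SDK) can assemble filter strings:

```python
def build_filter(dimension, value, op=""):
    """Assemble a filters[] value.

    op is "" (equals), "!" (not equals), "+" (set contains),
    or "-" (set omits). Pass a list for trace values.
    """
    if isinstance(value, list):
        # Trace values use comma-separated bracket notation
        value = "[" + ",".join(value) + "]"
    return f"{op}{dimension}:{value}"
```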

## Supported Operations

### Scalar Operations

Scalar operations can be used with single-value dimensions or simple key-value pairs. Use these operations when you want to filter by an exact match or exclusion.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `dimension:value` | Equals | `filters[]=country:US` | Field equals value |
| `!dimension:value` | Not equals | `filters[]=!operating_system:windows` | Field does not equal value |

### Set Operations

Use for trace dimensions that can have multiple values in an ordered list. Use these operations when you want to check if a single value appears in the trace dimension.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `+dimension:value` | Has | `filters[]=+video_cdn_trace:fastly` | Set contains value |
| `-dimension:value` | Omits | `filters[]=-video_cdn_trace:cloudflare` | Set does NOT contain value |

Please note that set operations cannot be used as a wildcard for substring searches. For example, `filters[]=+video_cdn_trace:fas` *cannot* be used to return views with CDN traces that contain `fastly`.

### Trace Operations

Use for trace dimensions that can have multiple values in an ordered list. Use this operation when you want to filter for an exact, ordered match.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `dimension:[value1,value2]` | Equals | `filters[]=video_cdn_trace:[fastly,akamai]` | Trace equals exactly `[fastly, akamai]` |

## Practical Examples

### Scalar (Basic) Operations

Filter for views from the US:

```
filters[]=country:US
```

Exclude mobile operating systems:

```
filters[]=!operating_system:mobile
```

### Set Operations

Find views with `fastly` as a CDN value in the `video_cdn_trace` dimension:

```
filters[]=+video_cdn_trace:fastly
```

Exclude views that went through `cloudflare`:

```
filters[]=-video_cdn_trace:cloudflare
```

### Trace Operations

Find views that went through exactly `fastly` first, then `akamai`:

```
filters[]=video_cdn_trace:[fastly,akamai]
```

Find views where no CDN value was set:

```
filters[]=video_cdn_trace:[]
```

### Multiple Filters

You can combine multiple filters.

**Filters with different dimensions**

When you combine filters with different dimensions, they are combined with AND.

```
# Views from US AND went through fastly
filters[]=country:US
filters[]=+video_cdn_trace:fastly
```

You can also combine filters with the same dimension. These are combined with OR.

```
# Views from US OR Canada
filters[]=country:US
filters[]=country:CA
```

However, if you combine filters on the same dimension with negated values, those are combined using AND.

```
# Views NOT from US AND NOT from Canada
filters[]=!country:US
filters[]=!country:CA
```

### Value Formatting

* **Scalar (basic) values**: Plain strings (`country:US`)
* **Trace values**: Comma-separated values in brackets (`video_cdn_trace:[a,b,c]`)
* **Empty traces**: Use empty brackets (`video_cdn_trace:[]`)

## Common Errors

❌ **Don't use brackets with set operators:**

```
filters[]=+video_cdn_trace:[fastly]  # Invalid
```

✅ **Correct:**

```
filters[]=+video_cdn_trace:fastly    # Valid
```

❌ **Don't use scalar operator syntax with trace dimensions:**

```
filters[]=video_cdn_trace:fastly  # Invalid
```

✅ **Use trace or set operator syntax with trace dimensions:**

```
filters[]=video_cdn_trace:[fastly]  # Exact match
filters[]=+video_cdn_trace:fastly   # Contains check
```

## URL Encoding

When using filters in URLs, remember to properly encode the parameters:

```bash
# Single filter
/metrics?filters[]=country:US

# Multiple filters
/metrics?filters[]=country:US&filters[]=+tags:beta&filters[]=video_cdn_trace:[fastly,akamai]

# URL encoded
/metrics?filters%5B%5D=country%3AUS&filters%5B%5D=%2Btags%3Abeta&filters%5B%5D=video_cdn_trace%3A%5Bfastly%2Cakamai%5D
```

Here's an example of how you can URL encode the filter params using JavaScript:

```javascript
// Example filters
const filters = [
  "country:US",
  "+tags:beta",
  "video_cdn_trace:[fastly,akamai]"
];

// Use URLSearchParams to build query string
const params = new URLSearchParams();
filters.forEach(f => params.append("filters[]", f));

// Full URL
const url = `/metrics?${params.toString()}`;

console.log(url);
// /metrics?filters%5B%5D=country%3AUS&filters%5B%5D=%2Btags%3Abeta&filters%5B%5D=video_cdn_trace%3A%5Bfastly%2Cakamai%5D
```

## Common Use Cases

### Analytics and Debugging

#### Find problematic CDN paths:

```
# Views that went through cloudflare but not fastly
filters[]=-video_cdn_trace:fastly
filters[]=+video_cdn_trace:cloudflare
```

#### Debug specific video delivery paths:

```
# Exact CDN sequence analysis
filters[]=video_cdn_trace:[fastly,akamai,cloudfront]
```

### Performance Analysis

#### High-performance regions:

```
# Exclude slow CDN providers
filters[]=-video_cdn_trace:slow-cdn
filters[]=!operating_system:legacy
```

#### Mobile vs Desktop comparison:

```
# Mobile traffic analysis
filters[]=operating_system:ios
filters[]=operating_system:android
```

### Content Filtering

#### Live vs VOD content:

```
# Exclude recorded content
filters[]=!content_type:recorded
filters[]=content_type:live
```

#### Platform-specific analysis:

```
# Web platform only, excluding mobile apps
filters[]=platform:web
filters[]=!platform:ios
filters[]=!platform:android
```

## Error Handling

The API will return validation errors for:

* Invalid dimension names
* Incorrect operator usage for dimension type
* Malformed values (e.g., mismatched brackets, quotes)
* Invalid operator combinations (e.g., `!+dimension:value`)

Example error response:

```json
{
  "error": "Sequence dimensions require bracket notation. Use video_cdn_trace:[value] instead of video_cdn_trace:value"
}
```

## Advanced Tips

### Testing Your Filters

To ensure your filters are working as expected, it can be helpful to limit the dataset you're working with. For that reason, you may wish to test your filters with a small date range first:

```bash
/metrics?timeframe[]=24:hours&filters[]=country:US&filters[]=+video_cdn_trace:akamai
```


# Stream export data to an Amazon Kinesis data stream
Learn how to send streaming exports data from Mux to Amazon Kinesis Data Streams.
<Callout type="info">
  Streaming Exports are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

<Callout type="info">
  For a detailed walkthrough of the Amazon Kinesis Data Streams setup process, see this [blog post](https://www.mux.com/blog/mux-amazon-kinesis-integration).
</Callout>

In order to stream exports from Mux to Amazon Kinesis Data Streams, you’ll need to set up a data stream in your AWS account. This guide covers the high-level steps required for setup.

## 1. Add a new streaming export

To add a new streaming export, go to **Settings > Streaming Exports** in your Mux dashboard. From that tab, click **New streaming export** to open the configuration modal.

Select the type of data you want to export, the environment you want to send data from, the export format, and select **Amazon Kinesis Data Streams** as the service.

## 2. Set up a data stream in Amazon Kinesis

You'll need to complete the following setup in your AWS account before you can create a new streaming export in Mux:

1. Create an Amazon Kinesis data stream.
2. Create an IAM role for Mux’s AWS account. To create the IAM role, you'll need Mux's AWS account ID and an external ID, which are shown in the configuration modal. See [this AWS user guide](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user_externalid.html) for more information about how the external ID is used. When creating the role, choose "AWS account" for the Trusted entity type. Select "Another AWS account" and enter Mux’s AWS account ID. Check "Require external ID" and paste in the "External ID" that Mux provided to you in the configuration modal.
3. Provide the IAM role you created with write access to your data stream. Here’s an example of an IAM policy that grants the necessary permissions (replace the resource with your data stream ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
          "kinesis:ListShards",
          "kinesis:PutRecord",
          "kinesis:PutRecords"
      ],
      "Resource": [
        "arn:aws:kinesis:{region}:{account-id}:stream/{stream-name}"
      ]
    }
  ]
}
```

## 3. Finish setup in Mux

In the configuration modal, provide the data stream ARN and IAM role ARN. Make sure the values you provide match these formats:

* Data stream ARN\
  `arn:aws:kinesis:{region}:{account-id}:stream/{data-stream-name}`
* IAM role ARN\
  `arn:aws:iam::{account-id}:role/{role-name}`
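
Before clicking **Enable export**, you can sanity-check the two ARNs against these formats. This is a loose illustrative check, not AWS's full ARN grammar:

```python
import re

ARN_FORMATS = {
    # Data stream ARN: arn:aws:kinesis:{region}:{account-id}:stream/{name}
    "data_stream": r"arn:aws:kinesis:[a-z0-9-]+:\d{12}:stream/[\w.-]+",
    # IAM role ARN: arn:aws:iam::{account-id}:role/{name}
    "iam_role": r"arn:aws:iam::\d{12}:role/[\w+=,.@/-]+",
}

def matches_format(kind, arn):
    return re.fullmatch(ARN_FORMATS[kind], arn) is not None
```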

Click **Enable export**, and your streaming export will be activated immediately. We will start streaming views as soon as they're completed.

## Process messages

With your export set up, you can begin consuming incoming messages. For more information on the message format and processing data, see the main [Export raw Mux data](/docs/guides/export-raw-video-view-data) guide.


# Stream export data to a Google Cloud Pub/Sub topic
Learn how to send streaming exports data from Mux to Google Cloud Pub/Sub.
<Callout type="info">
  Streaming Exports are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

In order to stream exports from Mux to a Pub/Sub topic, you'll need to set up a topic in your Google Cloud account. Mux will write data to the topic as it becomes available. This guide covers the high-level steps required for setup.

## 1. Add a new streaming export

To add a new streaming export, go to **Settings > Streaming Exports** in your Mux dashboard. From that tab, click **New streaming export** to open the configuration modal.

Select the type of data you want to export, the environment you want to send data from, the export format, and select **Google Cloud Pub/Sub** as the service.

## 2. Set up a topic in Google Cloud Pub/Sub

You'll need to complete the following setup in your Google Cloud account before you can create a new streaming export in Mux:

1. *(Optional)* If you want to use a schema with your Pub/Sub topic, you can create one using the Protobuf spec for the data you are exporting, which is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf).
2. Create a Pub/Sub topic. If you're creating a topic with a schema, set the message encoding to **Binary**.
3. Add the Mux service account to the topic as a Principal with the **Pub/Sub Publisher** role. The Mux service account is shown in the configuration modal.

## 3. Finish setup in Mux

In the configuration modal, provide the Pub/Sub topic name. This should be the full topic name, including the project ID, and match the format `projects/{project-id}/topics/{topic-id}`.

Click **Enable export**, and your streaming export will be activated immediately. We will start streaming data as soon as it becomes available.

## Process messages

With your export set up, you can begin consuming incoming messages. For more information on the message format and processing data, see the main [Export raw Mux data](/docs/guides/export-raw-video-view-data) guide.


# Signing JWTs
JSON Web Tokens are an open, industry standard method for representing claims securely between two parties. Mux APIs leverage JWTs to authenticate requests.
## What is a JWT?

JWTs are made up of a header, a payload, and a signature. The header contains metadata useful for decrypting the rest of the token. The payload contains configuration options. And the signature is generated from a signing key-pair. More information can be found at [jwt.io](https://jwt.io/).
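
You can see the three parts for yourself by splitting a token on `.` and base64url-decoding the first two segments (the signature is binary and is verified, not decoded, by the server). A minimal sketch:

```python
import base64
import json

def inspect_jwt(token):
    """Decode a JWT's header and payload (does NOT verify the signature)."""
    header_b64, payload_b64, _signature_b64 = token.split(".")

    def decode(segment):
        # JWT segments are base64url-encoded with padding stripped
        padded = segment + "=" * (-len(segment) % 4)
        return json.loads(base64.urlsafe_b64decode(padded))

    return decode(header_b64), decode(payload_b64)
```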

In order to sign the JWT you must create a signing key. Signing keys can be created from the [Signing Keys section](https://dashboard.mux.com/settings/signing-keys) of the Mux Dashboard or via the <ApiRefLink href="/docs/api-reference/system/signing-keys">Mux System API</ApiRefLink>. This key-pair will be used by a cryptographic function to sign JWTs.

## Signing JWTs during Development

While developing an app, you may want an easy way to generate JWTs locally because you're not yet ready to set up a full blown production system that signs JWTs for client-side applications. There are a few different options for generating these JWTs.

### Web Based JWT Signer

<Callout type="warning">
  Pasting credentials into a web browser is generally a bad practice. This web-based tool signs JWTs on the client which means your credentials never leave your machine. This is a tool designed by Mux, intended to be used with Mux credentials, and will always be hosted on a Mux domain. **Never use a tool like this if it is hosted on a non-Mux domain.**
</Callout>

Mux provides a web-based JWT Signer at https://jwt.mux.dev. Simply input the signing key-pair and configure the claims you wish to test your app with. Then copy the JWT into your application code and run it.

<Image src="/docs/images/jwt-signer.gif" width={600} height={440} alt="Mux's JWT Signer" />

### Node based CLI

Mux provides a [Node.js based CLI](https://github.com/muxinc/cli) for performing common tasks including signing JWTs for [playback IDs](https://github.com/muxinc/cli#mux-sign-playback-id).

After [installing Node.js](https://nodejs.org/), the Mux CLI must be initialized with an Access Token. Follow [this guide](/docs/core/make-api-requests#http-basic-auth) to create an Access Token. With your newly created Access Token, initialize the Mux CLI.

```
npx @mux/cli init
```

Now that the Mux CLI is initialized with your credentials, you can sign a JWT for [Video Playback](https://github.com/muxinc/cli#mux-sign-playback-id).

```
npx @mux/cli sign PLAYBACK-ID
```

For more details, refer to https://github.com/muxinc/cli.

<Callout type="warning">
  You should only sign a JWT on the server, where you can keep your signing key secret. You should not put your signing key in the client itself.
</Callout>

<Callout type="success">
  Set up a REST endpoint behind your own authentication system that provides your client-side code with signed JWTs. That way, the sensitive secret from the signing key-pair stays on the server instead of being included in the client.
</Callout>

## Signing JWTs for Production

Once you're ready for customers to start using your app, you need a way to sign JWTs securely at scale. Use the code examples below depending on which Mux product you would like to sign JWTs for.

### Sign Video Playback JWTs

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"

    "github.com/golang-jwt/jwt/v4"
)

func main() {

    playbackId := "" // Enter your signed playback id here
    keyId      := "" // Enter your signing key id here
    key        := "" // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": playbackId,
        "aud": "v",
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// We've created some helper functions for Node to make your signing-life easier
const Mux = require('@mux/mux-node');
const mux = new Mux();

async function createTokens () {
  const playbackId = ''; // Enter your signed playback id here

  // Set some base options we can use for a few different signing types
  // Type can be either video, thumbnail, gif, or storyboard
  let baseOptions = {
    keyId: '', // Enter your signing key id here
    keySecret: '', // Enter your base64 encoded private key here
    expiration: '7d', // E.g 60, "2 days", "10h", "7d", numeric value interpreted as seconds
  };

  const token = await mux.jwt.signPlaybackId(playbackId, { ...baseOptions, type: 'video' });
  console.log('video token', token);

  // Now the signed playback url should look like this:
  // https://stream.mux.com/${playbackId}.m3u8?token=${token}

  // If you wanted to pass in params for something like a gif, use the
  // params key in the options object
  const gifToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'gif',
    params: { time: '10' },
  });
  console.log('gif token', gifToken);

  // Then, use this token in a URL like this:
  // https://image.mux.com/${playbackId}/animated.gif?token=${gifToken}

  // A final example, if you wanted to sign a thumbnail url with a playback restriction
  const thumbnailToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'thumbnail',
    params: { playback_restriction_id: YOUR_PLAYBACK_RESTRICTION_ID },
  });
  console.log('thumbnail token', thumbnailToken);

  // When used in a URL, it should look like this:
  // https://image.mux.com/${playbackId}/thumbnail.png?token=${thumbnailToken}
}

```

```php

<?php
  // Using Composer and https://github.com/firebase/php-jwt
  require __DIR__ . '/vendor/autoload.php';
  use \Firebase\JWT\JWT;

  $playbackId = ""; // Enter your signed playback id here
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
    "sub" => $playbackId,
    "aud" => "t",          // v = video, t = thumbnail, g = gif.
    "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
    "kid" => $keyId,

    // Optional, include any additional manipulations
    "time"     => 10,
    "width"    => 640,
    "fit_mode" => "smartcrop"
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

token = {
    'sub': playback_id,
    'exp': int(time.time()) + 3600, # 1 hour
    'aud': 'v'
}
headers = {
    'kid': signing_key_id
}

json_web_token = jwt.encode(
    token, private_key, algorithm="RS256", headers=headers)

print(json_web_token)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(playback_id, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: playback_id, exp: expires.to_i, kid: signing_key_id, aud: audience}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

token = sign_url(playback_id, 'v', Time.now + 3600, signing_key_id, private_key_base64)

```



### Sign Data JWTs

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"
    "github.com/golang-jwt/jwt/v4"
)

func main() {

    myId := ""       // Enter the id for which you would like to get counts here
    myIdType := ""   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
    keyId := ""      // Enter your signing key id here
    key := ""        // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": myId,
        "aud": myIdType,
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// using @mux/mux-node@8

import Mux from '@mux/mux-node';
const mux = new Mux();
const myId = ''; // Enter the id for which you would like to get counts here
const myIdType = ''; // Enter the type of ID provided in myId; one of video_id | asset_id | playback_id | live_stream_id
const signingKeyId = ''; // Enter your Mux signing key id here
const privateKeyBase64 = ''; // Enter your Mux base64 encoded private key here

const getViewerCountsToken = async () => {
    return await mux.jwt.signViewerCounts(myId, {
        expiration: '1 day',
        type: myIdType,
        keyId: signingKeyId,
        keySecret: privateKeyBase64,
    });
};

const sign = async () => {
    const token = await getViewerCountsToken();
    console.log(token);
};

sign();

```

```php

<?php

  // Using https://github.com/firebase/php-jwt

  use \Firebase\JWT\JWT;

  $myId = "";       // Enter the id for which you would like to get counts here
  $myIdType = "";   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
  "sub" => $myId,
  "aud" => $myIdType,
  "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
  "kid" => $keyId
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

my_id = ''              # Enter the id for which you would like to get counts here
my_id_type = ''         # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

payload = {
    'sub': my_id,
    'aud': my_id_type,
    'exp': int(time.time()) + 3600, # 1 hour
}
headers = {
    'kid': signing_key_id
}

encoded = jwt.encode(payload, private_key, algorithm="RS256", headers=headers)
print(encoded)

```

```ruby

require 'base64'
require 'jwt'
require 'openssl'

def sign_url(subject, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: subject, aud: audience, exp: expires.to_i, kid: signing_key_id}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

my_id = ''                 # Enter the id for which you would like to get counts here
my_id_type = ''            # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''        # Enter your signing key id here
private_key_base64 = ''    # Enter your base64 encoded private key here

token = sign_url(my_id, my_id_type, Time.now + 3600, signing_key_id, private_key_base64)

```



# Secure video playback
In this guide you will learn how to use signed URLs for securing video playback.
If you add an asset or start a live stream through Mux without passing a playback policy, you'll be unable to access it in your browser using a URL. This may seem counterintuitive at first, but it gives you the ability to be explicit about who can access your content, as well as exactly how, where, and when they can access it.

There may be instances where you upload a video to Mux that is not intended to be made available for public viewing. For example, maybe you have a membership site that your users must join to access your videos, or you are offering a pay-to-access live stream.

For these scenarios, Mux offers **playback policies** that allow you to control the different ways users can view and interact with your content.

## Understanding playback policies

When you upload a video or initiate a live stream through Mux, you also have the option to define what type of fine-grained access should apply to your content. This is done by specifying a playback policy.

Mux offers two kinds of playback policies: `public` and `signed`.

* **Public** playback policies will enable playback URLs that can be watched anywhere, at any time, without any restrictions. This option is perfect for sharing your viral cat videos with the whole world.
* **Signed** playback policies will enable playback URLs that require a valid JSON Web Token (JWT) to gain access. The JWT should be signed and generated by your application on a protected server, not on a public client.

A playback policy can be specified when you create a new asset or live stream, or can be added to an existing asset or live stream.

Once an asset or live stream has been assigned a playback policy, it will be issued a new playback ID associated with that policy. Each asset or live stream can have multiple playback IDs.

See <ApiRefLink href="/docs/api-reference/video/assets/create-asset-playback-id">Create a playback ID</ApiRefLink> to learn how to add a new playback policy and ID to an existing Asset or Live Stream.

**Public** playback policies are pretty self-explanatory, so let’s dig into the signed playback policies.

## A closer look at signed playback policies

When you apply a signed playback policy to your content, there are two distinct ways you can restrict video playback:

1. **Expiration time** (required) allows you to specify a point in time when your issued JWT should be considered expired. Viewers with a valid token can watch videos until your specified expiration time value passes. All HTTP requests made to access your content past the expiration time are denied.
2. **Playback Restrictions** (optional) allow you to implement additional rules for playing videos. For example, let’s consider Referrer Validation. When you create a signed playback policy, you can supply a list of websites that are allowed to host and serve your content. Any requests from domains that aren't on the allow list are denied if they attempt to play back your content.

Referrer and User-Agent Validation Playback Restrictions are supported today; Mux plans to add more types of restrictions in the future.

Let’s walk through a typical workflow for creating a valid JWT used to access a Mux asset with a signed playback policy.

## 1. Create an Asset or Live Stream with a signed playback policy

Let’s start from scratch and add a new asset to our Mux account using a standard authenticated API call. Notice how we set the `playback_policies` value to `["signed"]` during this **Create Asset** request:

```json
// POST https://api.mux.com/video/assets

{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux-video-intro.mp4"
    }
  ],
  "playback_policies": [
    "signed"
  ],
  "video_quality": "basic"
}
```

## 2. Create a signing key for your Mux account environment

Next, we'll need to create a Mux signing key. Signing keys are used to generate valid JWTs for accessing your content. Signing keys can be managed (created, deleted, listed) from the [Signing Keys settings](https://dashboard.mux.com/settings/signing-keys) of the Mux dashboard or via the Mux System API.

<Callout type="info">
  Remember: Mux signing keys are different from Mux API keys.
</Callout>

When you create a new signing key, the API generates a 2048-bit RSA key-pair and returns the private key and a generated key-id. You should securely store the private key for signing the token, while Mux stores the public key to validate the signed tokens.

Signing keys are created and deleted independently of assets. You probably only need one signing key active at a time, but you can create multiple to enable key rotation, creating a new key and deleting the old one only after any existing signed URLs have expired.

See <ApiRefLink href="/docs/api-reference/system/signing-keys">Create a URL signing key</ApiRefLink> for full documentation.

```json
// POST https://api.mux.com/system/v1/signing-keys

{
  "data": {
    "private_key": "(base64-encoded PEM file with private key)",
    "id": "(unique signing-key identifier)",
    "created_at": "(UNIX Epoch seconds)"
  }
}
```

## 3. Create an optional Playback Restriction for your Mux account environment

Mux supports two types of playback restriction:

* Referrer Validation: restricts playback based on the domain in the HTTP `Referer` request header, and lets you decide whether requests with no referrer are allowed.
* User-Agent Validation: restricts playback based on the `User-Agent` request header, letting you block high-risk user agents and decide whether requests with no user agent are allowed.

During playback, a restriction is applied using a JWT claim, which will be covered in the next two sections.

<Callout type="info">
  Playback restrictions exist at the environment level. However, creating a playback restriction in an environment does not mean all assets are automatically restricted by it.

  Instead, you should apply a given restriction to a playback by referencing it in the token you create for a signed playback ID.
</Callout>

If you don’t need to use playback restrictions for your content, feel free to jump to the next step.

### Create a Playback Restriction

Most commonly, you want all the videos on your Mux account to be watched only on your website, `https://example.com`. To do so, create a new Playback Restriction with `example.com` as the only allowed domain that can play your videos.

See <ApiRefLink href="/docs/api-reference/video/playback-restrictions">Playback Restriction</ApiRefLink> for full documentation.

### Example API Request

```json
// POST https://api.mux.com/video/v1/playback-restrictions

{
  "referrer": {
    "allowed_domains": [
      "example.com"
    ],
    "allow_no_referrer": false
  }
}
```

### Example API Response

```json
{
  "data": {
    "updated_at": "1634595679",
    "referrer": {
      "allowed_domains": [
        "example.com"
      ]
    },
    "id": "JL88SKXTr7r2t9tovH7SoYS8iLBVsjZ2qTuFS8NGAQY",
    "created_at": "1634595679"
  }
}
```

Store the `id` value from the API response above as `PLAYBACK_RESTRICTION_ID` in your application for later use when generating the signed JWT.

### Playback Restriction Syntax

When you create a playback restriction, you may specify referrer domain and/or user agent restrictions in the same request: the `referrer` field specifies the referrer restrictions, and the `user_agent` field specifies the user agent restrictions. For the referrer `allowed_domains` list, you may specify up to 100 unique domains or subdomains where your videos will be embedded. Specify these in the `referrer.allowed_domains` array using valid DNS-style wildcard syntax. For example:

```json
{
  "referrer": {
    "allowed_domains": [
      "*.example.com",
      "foo.com"
    ],
    "allow_no_referrer": false
  },
  "user_agent": {
    "allow_no_user_agent": false,
    "allow_high_risk_user_agent": false
  }
}
```

Choose from the following options:

* To deny video playback requests for all domains, use an empty Array: `[]`
* To allow playback on `example.com` and all the subdomains of `example.com`, use the syntax: `["*.example.com", "example.com"]`
* Use a single wildcard `*` entry to allow video playback requests from any domain: `["*"]`
* A wildcard matches exactly one subdomain level. For instance, video playback will be denied from `xyz.foo.example.com` when you include `["*.example.com"]`.
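
The matching rules above can be sketched in Python. This is illustrative only; Mux performs this check server-side, and the helper below is our own approximation:

```python
# Approximate sketch of DNS-style wildcard matching for allowed_domains.
# This mirrors the rules described above; the real check happens on
# Mux's servers, so treat this as an illustration, not the implementation.

def referrer_allowed(domain, allowed_domains):
    for pattern in allowed_domains:
        if pattern == "*":
            return True  # a lone wildcard allows any domain
        if pattern.startswith("*."):
            # A wildcard covers exactly one subdomain level.
            suffix = pattern[1:]             # ".example.com"
            prefix = domain[: -len(suffix)]  # text before the suffix
            if domain.endswith(suffix) and prefix and "." not in prefix:
                return True
        elif domain == pattern:
            return True
    return False  # an empty list denies all domains

allowed = ["*.example.com", "example.com"]
print(referrer_allowed("foo.example.com", allowed))      # True
print(referrer_allowed("example.com", allowed))          # True
print(referrer_allowed("xyz.foo.example.com", allowed))  # False: two levels deep
```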

### Playback Restriction considerations

Here are some things to consider when using Playback Restrictions.

* You can create up to 100 different Playback Restrictions per environment on your Mux account.

* You can use a Playback Restriction ID for playing a single video or a group of videos.

* You have a lot of flexibility for associating Playback Restrictions with videos. For instance, you can create one Playback Restriction for each of your clients if your service or application supports multiple clients.

* You can add up to 100 different domains to each Playback Restriction.

* You can restrict playing video on domains added to the Playback Restriction. For instance, if you want multiple partner sites to play a video, you can add the partner site domain to the same Playback Restriction, thereby restricting playback only on those domains.

* If your player supports Chromecast, like Mux Player, make sure you add the Chromecast domain (`www.gstatic.com`) to your playback restrictions, otherwise casting will fail.

* If your player supports AirPlay, like Mux Player, you will only be able to AirPlay to third-party devices after adding the AirPlay domain (`mediaservices.cdn-apple.com`) to your playback restrictions. Because first-party Apple devices never forward the referrer header, `allow_no_referrer` must be set to `true` for AirPlay to work on those devices.

[Reach out to Mux Support](mailto:support@mux.com) if you have a use case that requires more than 100 Playback Restrictions or want to add more than 100 domains per Playback Restriction.

### Using `Referer` HTTP Header for validation

Web browsers send the website address requesting the video in the [`Referer` HTTP header](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Referer).
Mux matches the domains configured in the Playback Restriction with the domain in the `Referer` HTTP header. No video is delivered if there is no match.

The `Referer` HTTP header is sent by web browsers, but not by native iOS and Android applications. Therefore, Mux cannot perform domain validation on requests from native applications. For this reason, you can configure a Playback Restriction to allow or deny all HTTP requests without the `Referer` header by setting the `allow_no_referrer` boolean parameter.

First-party Apple devices, like Apple TV 4K, never set a referrer header regardless of the source. Therefore, if AirPlay to first-party Apple devices is required, `allow_no_referrer` must be set to `true`.

Please note that setting `allow_no_referrer` to `true` can result in content playback from unauthorized locations. As such, we strongly recommend creating two Playback Restriction objects, one with `allow_no_referrer` set to `true` and one with it set to `false`, and setting the appropriate Playback Restriction ID in the JWT for web vs. native iOS and/or Android applications.

### Using `User-Agent` HTTP Header for validation

The `User-Agent` HTTP header value is used to validate against a playback restriction. If the `allow_no_user_agent` field is set to `false`, playback will be denied if the request does not include a `User-Agent` value. For the `allow_high_risk_user_agent` validation, Mux maintains a list of user agents known to be associated with higher-risk video playback, such as playback devices that are not associated with legitimate end users of most systems. For more information, please reach out to [Mux Support](/support).

## 4. Generate a JSON Web Token (JWT)

All signed requests have a JWT with the following standard claims:

| Claim Code | Description | Value |
| :-- | :-- | :-- |
| sub | Subject of the JWT | Mux Video Playback ID |
| aud | Audience (intended application of the token) | `v` (Video or Subtitles/Closed Captions) <br /> `t` (Thumbnail) <br /> `g` (GIF) <br /> `s` (Storyboard) <br /> `d` (DRM License)|
| exp | Expiration time | UNIX Epoch seconds when the token expires. This should always exceed the current time plus the duration of the video, or portions of the video may be unplayable. |
| kid | Key Identifier | Key ID returned when signing key was created |

You can also include the following optional claims depending on the type of request.

| Claim Code | Description | Value |
| :-- | :-- | :-- |
| playback\_restriction\_id | Playback Restriction Identifier | `PLAYBACK_RESTRICTION_ID` from the previous step. Mux performs validations when the `PLAYBACK_RESTRICTION_ID` is present in the JWT claims body. This claim is supported for all `aud` types. |

The Image API (Thumbnails, Animated GIFs, Storyboards, and others) accepts several options to control image selection and transformations. More details on generating JWTs for images can be found [here](/docs/guides/secure-video-playback#note-on-query-parameters-after-signing).

For Playback IDs that use a public policy, the thumbnail options are supplied as query parameters on the request URL.

For Playback IDs that use a signed policy, the thumbnail options must be specified in the JWT claims when using signed URLs. This ensures that the thumbnail options are not altered, such as changing the timestamp or the dimensions of the thumbnail image. For example, if you uploaded a 4K video and wanted to restrict a thumbnail to a width of 600 pixels and a specific timestamp, then simply include the `width` and `time` keys in the JWT claims.
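
For instance, the claims body for that 600-pixel thumbnail might be assembled like this in Python. The placeholder values are hypothetical; pass the dict to `jwt.encode` exactly as in the signing examples later in this guide:

```python
import time

playback_id = "YOUR_SIGNED_PLAYBACK_ID"  # hypothetical placeholder
signing_key_id = "YOUR_SIGNING_KEY_ID"   # hypothetical placeholder

# Thumbnail options ride along as claims next to the standard ones,
# so viewers can't change the timestamp or dimensions after signing.
claims = {
    "sub": playback_id,
    "aud": "t",                     # t = thumbnail
    "exp": int(time.time()) + 600,  # 10 minutes from now
    "time": 30,                     # grab the frame at 30 seconds
    "width": 600,                   # cap the thumbnail at 600px wide
}
# The JWT header would carry {"kid": signing_key_id}, as in the
# signing examples in this guide.
```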

## A note on expiration time

Expiration time should be at least the duration of the Asset or the expected duration of the Live Stream. When the signed URL expires, the URL will no longer be playable, even if playback has already started. Make sure you set the expiration to be sufficiently far in the future so that users do not experience an interruption in playback.

Your application should consider cases where the user loads a video, leaves your application, then comes back later and tries to play the video again. You will likely want to detect this behavior and fetch a new signed URL so that playback can start again.
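
One simple way to act on both points is to derive the expiration from the asset's duration when signing, with generous slack for pauses and seeks. The helper below is a sketch with an arbitrary slack value, not a Mux recommendation:

```python
import time

def playback_token_exp(duration_seconds, slack_seconds=3600):
    """Return a UNIX exp claim that outlasts the whole video.

    Sketch only: the one-hour slack for pauses and seeks is an
    arbitrary choice; tune it for how your viewers actually watch.
    """
    return int(time.time()) + int(duration_seconds) + slack_seconds

# A two-hour asset gets a token valid for at least three hours.
exp = playback_token_exp(2 * 60 * 60)
```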

## 5. Sign the JSON Web Token (JWT)

The steps can be summarized as:

1. Load the private key used for signing
2. Assemble the claims (sub, exp, kid, aud, etc.) in a map
3. Encode and sign the JWT using the claims map, the private key, and the RS256 algorithm

There are dozens of software libraries for creating & reading JWTs. Whether you’re writing in Go, Elixir, Ruby, or a dozen other languages, don’t fret, there is most likely some JWT library you can rely on.

<Callout type="warning">
  The following examples assume you're working with either a private key returned from the <ApiRefLink href="/docs/api-reference/system/signing-keys">Signing Keys API</ApiRefLink> or one copied from the Dashboard, **not** one downloaded as a PEM file.
</Callout>

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"

    "github.com/golang-jwt/jwt/v4"
)

func main() {

    playbackId := "" // Enter your signed playback id here
    keyId      := "" // Enter your signing key id here
    key        := "" // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": playbackId,
        "aud": "v",
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// We've created some helper functions for Node to make your signing-life easier
const Mux = require('@mux/mux-node');
const mux = new Mux();

async function createTokens () {
  const playbackId = ''; // Enter your signed playback id here

  // Set some base options we can use for a few different signing types
  // Type can be either video, thumbnail, gif, or storyboard
  let baseOptions = {
    keyId: '', // Enter your signing key id here
    keySecret: '', // Enter your base64 encoded private key here
    expiration: '7d', // E.g 60, "2 days", "10h", "7d", numeric value interpreted as seconds
  };

  const token = await mux.jwt.signPlaybackId(playbackId, { ...baseOptions, type: 'video' });
  console.log('video token', token);

  // Now the signed playback url should look like this:
  // https://stream.mux.com/${playbackId}.m3u8?token=${token}

  // If you wanted to pass in params for something like a gif, use the
  // params key in the options object
  const gifToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'gif',
    params: { time: '10' },
  });
  console.log('gif token', gifToken);

  // Then, use this token in a URL like this:
  // https://image.mux.com/${playbackId}/animated.gif?token=${gifToken}

  // A final example, if you wanted to sign a thumbnail url with a playback restriction
  const thumbnailToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'thumbnail',
    params: { playback_restriction_id: 'YOUR_PLAYBACK_RESTRICTION_ID' }, // placeholder
  });
  console.log('thumbnail token', thumbnailToken);

  // When used in a URL, it should look like this:
  // https://image.mux.com/${playbackId}/thumbnail.png?token=${thumbnailToken}
}

createTokens();

```

```php

<?php
  // Using Composer and https://github.com/firebase/php-jwt
  require __DIR__ . '/vendor/autoload.php';
  use \Firebase\JWT\JWT;

  $playbackId = ""; // Enter your signed playback id here
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
    "sub" => $playbackId,
    "aud" => "t",          // v = video, t = thumbnail, g = gif.
    "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
    "kid" => $keyId,

    // Optional, include any additional manipulations
    "time"     => 10,
    "width"    => 640,
    "fit_mode" => "smartcrop"
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

token = {
    'sub': playback_id,
    'exp': int(time.time()) + 3600, # 1 hour
    'aud': 'v'
}
headers = {
    'kid': signing_key_id
}

json_web_token = jwt.encode(
    token, private_key, algorithm="RS256", headers=headers)

print(json_web_token)

```

```ruby

require 'base64'
require 'jwt'
require 'openssl'

def sign_url(playback_id, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: playback_id, exp: expires.to_i, kid: signing_key_id, aud: audience}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

token = sign_url(playback_id, 'v', Time.now + 3600, signing_key_id, private_key_base64)

```



## 6. Include the JSON Web Token (JWT) in the media URL

Supply the JWT in the resource URL using the `token` query parameter. The Mux Video service will inspect and validate the JWT to make sure the request is allowed.

Video URL example:

```sh
https://stream.mux.com/{playback-id}.m3u8?token={JWT}
```

Thumbnail options are supplied as query parameters when using a public policy. When using a signed policy, the thumbnail options must be specified as claims in the JWT following the same naming conventions as with query parameters.

Thumbnail URL example:

```sh
https://image.mux.com/{playback-id}/thumbnail.{format}?token={JWT}
```

<Callout type="warning" title="Passing `token` for public playback IDs will fail">
  If you include a `token=` query parameter for a `"public"` playback ID, the request will fail. This is intentional, so as not to create a false appearance of security when using a public playback ID.

  If your application uses a mix of "public" and "signed" playback IDs, you should save the playback policy type in your database and include the token parameter only for signed playback IDs.
</Callout>
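
A minimal sketch of that branching logic, assuming you store each playback ID's policy alongside it (the function and parameter names below are our own):

```python
def playback_url(playback_id, policy, token=None):
    """Build a stream URL, attaching a token only for signed playback IDs."""
    url = f"https://stream.mux.com/{playback_id}.m3u8"
    if policy == "signed":
        if token is None:
            raise ValueError("signed playback IDs require a token")
        url += f"?token={token}"
    return url

print(playback_url("abc123", "public"))
# https://stream.mux.com/abc123.m3u8
print(playback_url("xyz789", "signed", token="eyJ..."))
# https://stream.mux.com/xyz789.m3u8?token=eyJ...
```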

## Note on query parameters after signing

When you're signing a URL, you're signing the parameters for that URL as well. After the parameters are signed for a playback ID, the resulting signed URL should *only* contain the `token` parameter. This is important because leaving the parameters in the URL would both:

* expose more information about the underlying asset than you may want
* result in an incorrect signature since the extraneous parameters would alter the URL.

<Callout type="warning" title="Be sure to include `params` in your `claims` body">
  While the JWT helper we expose in our Node SDK passes in additional parameters as an extra hash, when working with the JWT directly, these `params` should be embedded directly in your `claims` body.
</Callout>

## Example

Let's say we're taking the following public example and making a signed URL:

* `https://image.mux.com/{public_playback_id}/thumbnail.jpg?time=25`

Generate a signed URL with `{time: 25}` in the **claims body**. Using the helper example we wrote above, this would look like:

* `sign(signedPlaybackId, { ...requiredTokenOptions, params: { time: 25 } })`

**Correct** Signed URL:

* `https://image.mux.com/{signed_playback_id}/thumbnail.jpg?token={token}`

**Bad** Signed URL:

* `https://image.mux.com/{signed_playback_id}/thumbnail.jpg?time=25&token={token}`

The rule about including query parameters in the token also applies to playback modifiers like `default_subtitles_lang`, `redundant_streams`, and `roku_trick_play`. The JWT claims body must include the extra parameter:

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "{AUDIENCE_TYPE}",
  "exp": "{EXPIRATION_TIME}",
  "redundant_streams": true
}
```

## Passing custom parameters to a signed token

With signed URLs, you can pass extra parameters via a `custom` key in the claims body, as shown in the example below.

This may be useful in order to identify bad actors that share signed URLs in an unauthorized way outside of your application. If you find out that a signed URL gets shared then you can decode the parameters and trace it back to the user who shared it. When including extra parameters like this, be sure to respect the following guidelines:

* Do NOT under any circumstances include personally identifiable information (PII) like a name or email address.
* Put your custom parameters nested inside the `"custom"` key.

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "{AUDIENCE_TYPE}",
  "exp": "{EXPIRATION_TIME}",
  "custom": {
    "session_id": "xxxx-123"
  }
}
```
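
If a URL does leak, the claims body of its token is plain base64url-encoded JSON, so you can read the `custom` key back without the signing key. A stdlib-only Python sketch (never use unverified claims for authorization decisions):

```python
import base64
import json

def read_claims(token):
    """Decode a JWT's claims body without verifying the signature.

    Fine for tracing tokens you issued yourself; never trust
    unverified claims to grant access.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Example: trace a leaked URL back to the session that generated it.
# read_claims(leaked_token).get("custom", {}).get("session_id")
```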


# Protect videos with DRM
Learn how to leverage Digital Rights Management (DRM) to protect your videos
## What is DRM?

<Callout type="info">
  Check out our blog on ["What is DRM"](https://www.mux.com/blog/what-is-drm) to learn more about the concepts of DRM.
</Callout>

DRM (Digital Rights Management) provides an extra layer of content security for video content streamed from Mux.

Leveraging DRM blocks or limits the impact of:

* Screen recording
* Screen sharing
* Downloading tools

Mux uses the industry standard protocols for delivering DRM protected video content, specifically Google Widevine, Microsoft PlayReady, and Apple FairPlay.

DRM requires the use of [Signed URLs](/docs/guides/secure-video-playback), and when combined with [Domain and User-Agent restrictions](/docs/guides/secure-video-playback#3-create-an-optional-playback-restriction-for-your-mux-account-environment), can provide a very strong content protection story, up to and including security levels that satisfy the requirements of Hollywood studios.

### How Mux DRM Protects Your Content

Mux Video's DRM is built to support the strongest protection available for each device without affecting playability. This protection comes in three parts.

* **Video encryption:** ensures you can't play the video without the proper license.
* **Screen capture protection:** ensures that you can't take screenshots or record the screen.
* **HDCP:** prevents recording video from video outputs like HDMI.

However, not every device supports all three protection layers. Mux has configured DRM at a security level that ensures broad device compatibility while still providing meaningful protection. The following table shows the types of protection you can expect across different devices:

| Device Type | Encrypted Video | Screen Capture Protection | HDCP Enforced | Details |
| ----- | ----- | ----- | ----- | ----- |
| **iPhone/iPad** | ✅ Yes | ✅ Yes | ✅ Yes | Apple supports hardware-level protection on all devices created since the iPhone 5s. |
| **Modern Android devices** | ✅ Yes | ✅ Usually | ❌ No | Newer devices such as Google Pixel phones or Samsung phones with Android 12+, or any device with Widevine level 1 support can prevent screen capture. |
| **Older and lower-end Android devices** | ✅ Yes | Sometimes | ❌ No | Many lower-end Android devices are missing the secure hardware necessary for Widevine level 1 and hardware decryption. Many of these devices still try to block screen capture, but it's not nearly as secure. |
| **Chrome/Edge browsers on desktop** | ✅ Yes | Sometimes | ❌ No | Browser-based playback usually relies on software decryption. Many of these devices still try to block screen capture but it's not nearly as secure. |

#### Additional protections

If you're distributing premium content with strict security requirements (like major studio releases), you may need additional types of protection. For this you have a few options:

* **Device-level security upgrades:** Some players allow you to request stronger DRM when devices support it. This ensures hardware-backed devices get enhanced protection while others play at baseline levels.
* **Custom DRM configuration**: We're investigating more granular security controls and looking for early partners to help test these features. [Reach out](https://www.mux.com/support/human) for more information.
* **Video watermarking**: Add visible watermarks using Mux's [watermarking](https://www.mux.com/docs/guides/add-watermarks-to-your-videos) feature, which embeds them directly into the video and prevents misattribution. This doesn't support per-user or forensic watermarking, but you can add per-user watermarks by overlaying images on your player - just know these are easier to bypass since they're not baked into the video. Forensic watermarking isn't currently available, but if it would be valuable for your use case, [let us know](https://www.mux.com/support/human).  We'd love to hear your feedback.
* **Trust the DRM Configuration**: Attempting to increase security by detecting device capabilities or security levels yourself will be painful. The device landscape changes constantly and maintaining accuracy is nearly impossible. This is better addressed with custom DRM configurations.

If you need any of these additional protections, or something we've overlooked, [contact us](https://www.mux.com/support/human). We'd love to hear from you.

## Prerequisites

Before you can start using Mux DRM you must complete the onboarding process. The following is a quick overview of the entire process, and you can find additional detail later in this guide.

1. [Request a FairPlay certificate](#step-1-request-a-fairplay-certificate) for playback on Apple Devices. Don't worry about Widevine and PlayReady certificates, we'll handle those for you.
2. While waiting on FairPlay approval, go to Settings -> Digital Rights Management in your Mux dashboard to request DRM access.
3. After DRM is enabled on your environments we'll send you a DRM configuration ID and tell you how to securely send us your FairPlay certificates.

Once these steps are complete you will have your DRM configuration ID and be able to test DRM playback on non-Apple devices. Once you've sent us your FairPlay certificates you can test on Apple devices.

### Step 1: Request a FairPlay certificate

DRM playback will work out of the box on every [supported platform](#supported-platforms) except Apple devices. Apple requires you to request your own FairPlay Streaming Deployment package (FPS, often simply referred to as a "FairPlay certificate"). An FPS deployment package can only be requested if you meet the following requirements.

1. You have an Apple Developer account with an active subscription.
2. If you're part of a team account, you must be logged in as the team's account owner.

Once you've met these requirements, you will need to fill out a form. The initial questions ask about your DRM infrastructure, which is Mux. Here's some guidance on how to answer those questions.

|    |    |
| :---- | :---- |
| **Does your organization have a working FPS development server where you'll use the FPS certificate?** | Select "Yes". You will use Mux's verified FPS implementation. |
| **Do you have a third-party streaming distribution partner?** | Select "Yes". Mux is that third-party partner. |
| **Streaming Distribution (DRM License Server) Partner Name** | Enter "Mux, Inc.". |
| **Streaming Distribution (DRM License Server) Partner Website** | Enter "https://mux.com". |
| **Your Company** | Describe your company and the services it provides. |
| **Your Content** | Describe the type of content you will be protecting with FairPlay and why that content needs DRM. |
| **Do you own the content you want to stream?** | If you hold full copyright ownership of your content, select "Yes". Otherwise select "No" and answer the following two additional questions that appear: |
| **Do you have a content licensing agreement with the owner of the content?** | If you license third-party content, select "Yes". |
| **Your Content Provider** | If you license third-party content, include the name of that provider and a description of the rights you have to use their content. |
| **Is this your first request for FPS credentials?** | If this is your first time submitting this form or requesting a FairPlay certificate, select "Yes". |
| **Do you assert that the account holder of this developer account owns, or has a license to use, the content that you will be streaming?** | Select the most appropriate answer for your situation. |

Once you've submitted the request, approval may take several days. Once approved, Apple will provide documentation for generating the final certificate. This includes generating a private key via your terminal and filling out a form.

Once you've created your FairPlay deployment package, [contact us](/support/human) and we'll walk you through securely sending us the files.

<Callout type="info">
  You do not need to request a Widevine or PlayReady certificate, Mux manages these for you.
</Callout>

### Step 2: Request DRM for your environment

Go to Settings -> Digital Rights Management to request DRM access. This page will walk you through the necessary requirements and allow you to request access. You will receive a response via email with next steps.

### Step 3: Receive your DRM configuration ID

After we enable DRM on your environment you can find your DRM configuration ID in Settings -> Digital Rights Management. You'll need this when you add a DRM playback ID to an asset.

You can also use the <ApiRefLink href="/docs/api-reference/video/drm-configurations">DRM Configurations API</ApiRefLink> to list the DRM Configurations available to your account.

## Create a DRM protected asset or live stream

Mux Video supports applying DRM to both live streams and assets.

### Creating a DRM protected asset

When using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create Asset API</ApiRefLink>, you can add DRM protection by including `advanced_playback_policies` with your DRM configuration ID. Make sure to set the `video_quality` to `plus` or `premium`, as DRM is only supported on these quality levels.

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }
  ],
  "video_quality": "plus"
}
```

When working with `advanced_playback_policies`, keep in mind that you can't use both the `playback_policy` and `advanced_playback_policies` fields in the same request. When working with DRM, stick to `advanced_playback_policies`. If you need more than one playback policy, such as for static renditions, you can include multiple policies in the `advanced_playback_policies` array.
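
For example, a request body combining a signed playback ID with a DRM playback ID would include two entries in the array. This is an illustrative sketch; the field names match the Create Asset API example above:

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "advanced_playback_policies": [
    {
      "policy": "signed"
    },
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }
  ],
  "video_quality": "plus"
}
```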

If you need to add DRM protection to an existing asset, you can use the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-playback-id">Playback IDs API</ApiRefLink> to retroactively add a DRM playback policy. This works for any asset created after DRM was enabled in your environment.
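
As a sketch, the request to the Playback IDs API for an existing asset would look something like this (illustrative body; see the API reference linked above for the full set of fields):

```json
// POST /video/v1/assets/{asset-id}/playback-ids
{
  "policy": "drm",
  "drm_configuration_id": "your-drm-configuration-id"
}
```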

### Creating a DRM protected live stream

Just like creating a DRM protected asset, creating a DRM protected live stream requires your DRM configuration ID to be set in the live stream's `advanced_playback_policies`.

In the example below, we also set `new_asset_settings` to use DRM, so any [DVR assets](/docs/guides/live-streaming-faqs#is-it-possible-to-rewind-live-content-while-the-live-stream-continues) and on-demand assets created from the stream also have DRM applied.

```json
// POST /video/v1/live-streams
{
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }
  ],
  "new_asset_settings": {
    "advanced_playback_policies": [
      {
        "policy": "drm",
        "drm_configuration_id": "your-drm-configuration-id"
      }
    ]
  }
}
```

## Play DRM protected videos

Mux supports three types of DRM: Widevine, FairPlay, and PlayReady. These three DRM systems cover the vast majority of devices in use today, including [desktop browsers](#desktop-browsers), [mobile browsers](#mobile-browsers), and [living room devices (OTT)](#living-room-devices-ott).

### Supported Platforms

Before you start to build your DRM integration, make sure Mux's DRM supports your target platforms. Mux's DRM is verified to work on all of the following platforms, but likely works on additional platforms. If you would like us to verify an additional platform, please [contact us](/support/human).

##### Desktop Browsers

The following desktop browsers support Mux DRM via the [Mux Web Player](#mux-web-player), or any of the players listed in our [player documentation](#playback-in-mux-players).

* Chrome (macOS and Windows)
* Firefox (macOS and Windows)
* Safari (macOS)
* Edge (Windows)
* Legacy Edge (Windows)

##### Mobile Browsers

The following mobile browsers support Mux DRM via the [Mux Web Player](#mux-web-player), or any of the players listed in our [player documentation](#playback-in-mux-players).

* Chrome on Android
* Firefox on Android
* All browsers on iOS

##### Native Mobile Apps

* Android apps using [Mux Player for Android](#mux-player-android)
* iOS apps using [Mux Player for iOS](#mux-player-ios)

##### Living room devices (OTT)

The following living room devices support Mux DRM; each entry links to the relevant documentation.

* [Chromecast](#chromecast)
* [Google TV](#mux-player-android)
* [Apple TV (tvOS)](#mux-player-ios)
* [Roku](#roku)
* [Fire TV](#mux-player-android)

### Creating your playback and license tokens

To successfully play back content protected by Mux DRM you will need your asset's playback ID and two signed tokens: a playback token and a DRM license token. Both tokens are signed using the JWT requirements laid out in our [secure video playback guide](/docs/guides/secure-video-playback#2-create-a-signing-key-for-your-mux-account-environment).

If you're using our Node library, we offer helper functions to sign these tokens:

```js
const mux = new Mux({ 
  tokenId: "your-access-token-id", 
  tokenSecret: "your-access-token-secret", 
  jwtSigningKey: "your-environment-signing-public-key",
  jwtPrivateKey: "your-environment-signing-private-key"
});

const playbackToken = await mux.jwt.signPlaybackId("your-playback-id", {expiration: '7d'});
const drmLicenseToken = await mux.jwt.signDrmLicense("your-playback-id", {expiration: '7d'});
```

Once you have created your signed tokens, you can use them directly in a [Mux player](#playback-in-mux-players). If you are using a non-Mux player, you'll use these tokens to build your [playback and license URLs](#creating-license-urls).
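
Expired tokens are a common cause of DRM playback failures. As a quick sanity check, you can decode a token's payload locally before handing it to a player. This helper is illustrative only and is not part of the Mux SDK:

```javascript
// Decode a JWT's payload without verifying the signature (Node 16+).
// Useful only for local debugging, never for trust decisions.
function decodeJwtPayload(token) {
  const payloadSegment = token.split('.')[1];
  // JWT segments use base64url encoding
  return JSON.parse(Buffer.from(payloadSegment, 'base64url').toString('utf8'));
}

// Returns true if the token's `exp` claim is in the past.
function isExpired(token, nowSeconds = Math.floor(Date.now() / 1000)) {
  const { exp } = decodeJwtPayload(token);
  return typeof exp === 'number' && exp <= nowSeconds;
}
```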

### Playback in Mux players

Now that you have the necessary tokens, you can hook them into a Mux player.

#### Mux Web Player

To play back DRM protected content, instantiate the player with the `drm-token` parameter set to the DRM license token that you generated [previously](#creating-your-playback-and-license-tokens). In Mux Player React, you'll use the `tokens` prop to set the `drm` token.

<Callout type="info">
  Support for DRM in Mux Player was added in version 2.8.0.
</Callout>

```embed
<iframe
  src="https://player.mux.com/your-playback-id?drm-token=your-drm-token&playback-token=your-playback-token&thumbnail-token=your-thumbnail-token&storyboard-token=your-storyboard-token"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

```html
<mux-player
  playback-id="your-playback-id"
  playback-token="your-playback-token"
  drm-token="your-drm-token"
  thumbnail-token="your-thumbnail-token"
  storyboard-token="your-storyboard-token"
></mux-player>
```

```react
<MuxPlayer
  playbackId="your-playback-id"
  tokens={{
    playback: "your-playback-token",
    drm: "your-drm-token",
    thumbnail: "your-thumbnail-token",
    storyboard: "your-storyboard-token",
  }}
/>
```



[You can see a demo of this working in codesandbox here.](https://codesandbox.io/p/sandbox/mux-player-drm-test-5qh2pm?file=%2Findex.html%3A19%2C7)

With your new tokens all wired up correctly, you should be able to play back your freshly DRM protected content!

[*Here's a demo page with some pre-prepared DRM protected content you can also test a device against.*](https://5qh2pm.csb.app/)

[Full documentation for using DRM with Mux Player for web can be found here.](/docs/guides/player-advanced-usage#using-digital-rights-management-drm)

#### Mux Player iOS

<Callout type="info">
  Support for DRM in Mux Player for iOS was added in version 1.1.0.
</Callout>

The DRM license token can be configured on `PlaybackOptions` using the following API:

```swift
let playbackOptions = PlaybackOptions(
  drmToken: "your-drm-license-token",
  playbackToken: "your-playback-token"
)

let playerItem = AVPlayerItem(
  playbackID: "your-playback-id",
  playbackOptions: playbackOptions
)
```

[Full documentation for using DRM with Mux Player for iOS can be found here.](/docs/guides/mux-player-ios#secure-your-playback-experience)

#### Mux Player Android

The DRM license token can be configured when instantiating a `MediaItem` using the `MediaItems` factory class as follows:

<Callout type="info">
  Support for DRM in Mux Player for Android was added in version 1.1.0.
</Callout>

```kotlin
// You don't need to add your own DrmSessionManager, we take care of this

val player = // Whatever you were already doing

val mediaItem = MediaItems.mediaItemFromPlaybackId(
  playbackId = "your-playback-id",
  playbackToken = "your-playback-token",
  drmToken = "your-drm-license-token"
)

// Normal media3 boilerplate
player.setMediaItem(mediaItem)
player.prepare()
```

[Full documentation for using DRM with Mux Player for Android can be found here.](/docs/guides/mux-player-android#secure-your-playback-experience)

### Playback in third-party players

If you can't use one of the [Mux players](#playback-in-mux-players), you still have options. Mux's DRM is compatible with a wide range of third party players. Because these players don't know exactly how Mux's DRM works, you'll need to build the correct [playback and license URLs](#creating-license-urls), then add them to the player.

#### Creating license URLs

The following examples demonstrate the license URL structure for each of the supported DRM providers.

##### Widevine

```
https://license.mux.com/license/widevine/{playback-id}?token={drm-license-token}
```

##### FairPlay

Before you can use FairPlay DRM, you must [request the proper certificate from Apple](#step-1-request-a-fairplay-certificate). Once FairPlay is enabled on your account, you will use one license URL and one certificate URL.

**License URL**

```
https://license.mux.com/license/fairplay/{playback-id}?token={drm-license-token}
```

**Certificate URL**

```
https://license.mux.com/appcert/fairplay/{playback-id}?token={drm-license-token}
```

##### PlayReady

```
https://license.mux.com/license/playready/{playback-id}?token={drm-license-token}
```
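
If you're assembling these URLs in code, a small helper keeps them consistent across key systems. This is a hypothetical convenience, not part of any Mux SDK:

```javascript
// Build Mux DRM license URLs from a playback ID and a signed DRM license token.
// keySystem is one of 'widevine', 'fairplay', or 'playready'.
function licenseUrl(keySystem, playbackId, drmLicenseToken) {
  return `https://license.mux.com/license/${keySystem}/${playbackId}?token=${drmLicenseToken}`;
}

// FairPlay additionally requires the application certificate URL.
function fairplayCertificateUrl(playbackId, drmLicenseToken) {
  return `https://license.mux.com/appcert/fairplay/${playbackId}?token=${drmLicenseToken}`;
}
```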

#### Third party players

The following third-party players have been tested with Mux DRM. If you are using a player not listed here, check out our notes on [other players](#other-players).

##### Roku

To play back DRM protected content on Roku, add your DRM configuration to your content node. This includes the following:

1. Add the following two fields to your channel's manifest:
   ```text
   requires_widevine_drm=1
   requires_widevine_version=1.0
   ```
2. When preparing your `contentNode`, ensure you reference the DRM configuration and license URL as follows:

```brightscript
drmParams = {
  keySystem: "Widevine",
  licenseServerURL: "https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_JWT}"
}

contentNode = CreateObject("roSGNode", "ContentNode")
contentNode.url = "<content URL>"
contentNode.drmParams = drmParams
contentNode.title = "<your title>"
contentNode.length = <duration in seconds>

' other contentNode properties can be added here, 
' then play your video as you normally would
```

##### Chromecast

Chromecast devices use Google Cast to send videos from one device to another. There are quite a few steps to set up Google Cast, so we recommend you check out our [Google Cast guide](/docs/guides/play-drm-protected-videos-on-google-cast) for more details.

##### HLS.js

[HLS.js](https://github.com/video-dev/hls.js?tab=readme-ov-file) supports DRM via configuration keys in any browser with native [MSE](https://developer.mozilla.org/en-US/docs/Web/API/Media_Source_Extensions_API) support. For browsers that do not support MSE (for example, Safari on older iOS versions), you will need to use the native video element. Both flows are included in the example below.

```js
// This browser supports MSE and EME, so we can use hls.js
if (Hls.isSupported()) {
  var hls = new Hls({
    emeEnabled: true,
    drmSystems: {
      'com.widevine.alpha': {
        licenseUrl: 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      },
      'com.microsoft.playready': {
        licenseUrl: 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      },
      'com.apple.fps': {
        licenseUrl: 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
        serverCertificateUrl: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      }
    }
  });
// This browser supports EME but not MSE, so we need to use the native video element
} else if (video.canPlayType('application/x-mpegURL')) {
  video.src = mediaUrl;

  video.addEventListener('encrypted', async function(event) {
    const initDataType = event.initDataType;
    const initData = event.initData;

    // Retrieve a MediaKeySystemAccess object to interact with the DRM system
    const access = await navigator.requestMediaKeySystemAccess('com.apple.fps', [{
        initDataTypes: [initDataType],
        videoCapabilities: [{ contentType: 'application/vnd.apple.mpegurl', robustness: '' }],
        distinctiveIdentifier: 'not-allowed',
        persistentState: 'not-allowed',
        sessionTypes: ['temporary'],
    }]);

    if (!access) {
        console.error('Cannot play DRM-protected content with current security configuration on this browser. Try playing in another browser.');
        return;
    }

    // Create DRM keys
    const keys = await access.createMediaKeys();

    // Get the FairPlay license and certificate
    const certificate = await fetch('https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}').then(async (res) => {
        const keyBuffer = await res.arrayBuffer();
        return new Uint8Array(keyBuffer);
    });

    if (!certificate) {
        console.error('Failed to fetch certificate');
        return;
    }

    // Attach the certificate to the DRM keys
    await keys.setServerCertificate(certificate);

    // Attach the keys to the video element
    await video.setMediaKeys(keys);

    // Create a playback session
    const session = (video.mediaKeys).createSession();

    // Create the data necessary to make a DRM license request
    const message = await new Promise((resolve, reject) => {
        session.generateRequest(initDataType, initData);
        session.addEventListener('message', (messageEvent) => {
            resolve(messageEvent.message);
        }, { once: true });
    });

    // Get a DRM license
    const response = await fetch('https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}', {
        method: 'POST',
        headers: { 'Content-type': 'application/octet-stream' },
        body: message,
    });

    // Attach the license key to the session
    const licenseData = await response.arrayBuffer();
    await session.update(licenseData);
  });
}
```

For more details, check out [the HLS.js DRM docs](https://github.com/video-dev/hls.js/blob/master/docs/API.md#drmsystems).

##### Video.js

[Video.js](http://videojs.com) supports DRM via the [`videojs-contrib-eme` plugin](https://github.com/videojs/videojs-contrib-eme).

```js
const player = videojs('vid1', {});

player.eme();
player.src({
  src: 'https://stream.mux.com/{playback-id}.m3u8?token={JWT}',
  type: 'application/x-mpegURL',
  keySystems: {
    'com.widevine.alpha': 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
    'com.apple.fps.1_0': {
      certificateUri: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      licenseUri: 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
    },
    'com.microsoft.playready': 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
  }
});
```

For more details, check out the [videojs-contrib-eme docs](https://github.com/videojs/videojs-contrib-eme?tab=readme-ov-file#initialization).

##### Shaka player

Shaka player supports DRM via configuration keys.

```js
player.configure({
  drm: {
    servers: {
      'com.widevine.alpha': 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      'com.apple.fps.1_0': 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      'com.microsoft.playready': 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
    },
    advanced: {
      'com.apple.fps.1_0': {
        serverCertificateUri: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      }
    }
  }
});
```

For more details, check out the [Shaka player DRM docs](https://shaka-player-demo.appspot.com/docs/api/tutorial-drm-config.html).

##### Other players

While we have only tested playback with the players listed above, many others will work just fine. If your platform supports playback of HLS, CMAF packaged streams with Widevine, PlayReady, or FairPlay DRM using CBCS encryption, Mux DRM will likely work using the [license URLs](#creating-license-urls) described above. If you would like us to support additional platforms, [let us know](/support/human).

### Testing that DRM is working

Checking that your video is DRM protected is pretty simple: just take a screenshot! If DRM is working correctly, the video should be replaced with either a black rectangle or a single frame from the start of the video.

## Configure DRM security levels

Currently Mux's DRM feature defaults to a balance of security and playability, including automatically leveraging higher security levels on devices where this is available.

At times customers may want to adjust this balance to increase security levels, for example to meet contractual Hollywood studio security requirements. [Please contact us](/support) if you need to discuss or adjust the security levels used.

In the future, we will allow self-serve adjustment of security levels through the DRM Configurations API.

In line with common industry practices, only video tracks are currently DRM protected, meaning that audio-only assets and audio-only live streams are not protected by DRM.

## Pricing

DRM is an add-on feature to Mux Video, with a $100/month access fee + $0.003 "[per license](#what-is-a-drm-license)", and discounts available for high volumes.

### What is a DRM license?

One DRM license request typically corresponds to one video view. When a viewer starts watching a DRM-protected video, the player requests a license to decrypt and play the content. While licenses and video views usually line up, the exact count can vary depending on each player's caching behavior.
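
As a back-of-the-envelope example using the list prices above (before any volume discounts), and assuming roughly one license per view:

```javascript
// Estimate monthly DRM cost at list price:
// $100 access fee + $0.003 per license issued.
function estimateMonthlyDrmCost(licenses) {
  const ACCESS_FEE_USD = 100;
  const PER_LICENSE_USD = 0.003;
  return ACCESS_FEE_USD + licenses * PER_LICENSE_USD;
}

// 1,000,000 DRM-protected views in a month ≈ $3,100 at list price
```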


# Play DRM protected videos on Google Cast Devices
Learn how to use Google Cast with DRM-protected Mux Video content.
Google Cast is a popular method for sending video from one device to be played on another, often from a phone to a TV. Most players support Google Cast out of the box, but if your video is protected by DRM, you will need to do a little more work.

## Overview

Google Cast integrations are made up of two parts, a "sender" and a "receiver". In the example of a phone casting to a TV, the player in a mobile browser is the sender and the receiver is a webpage sent to the TV.

There are quite a few steps in this process, so here's a quick overview:

1. Create a sender, either with the Mux Player or by writing your own
2. Create your playback ID, playback token and DRM token, and add them to the sender
3. Register your test device in the Google Cast dashboard
4. Create a custom receiver and register it on the Google Cast dashboard
5. Add the registered receiver ID to the custom sender

## Sender Setup

A Google Cast sender is an app with a "cast" button. Clicking the cast button performs two actions.

1. If you're not yet connected to another device, the cast button helps set up that connection.
2. Once you're connected to a device, the sender sends everything needed to play a video to the receiver.

In this section we'll walk through how to write your own web-based Google Cast sender.

### Mux Player Sender

The easiest way to set up your own sender is with [Mux Player for Web](/docs/guides/mux-player-web). In addition to the usual `playback-id` property, you'll need to include additional security and Google Cast fields:

* `playback-token`: A signed playback token, as documented in our [Guide to Secured Video Playback](/docs/guides/secure-video-playback)
* `drm-token`: A signed DRM token, as documented in the [Sign a DRM license token](/docs/guides/protect-videos-with-drm#sign-a-drm-license-token) section of our DRM guide
* `cast-receiver`: The application ID of your custom receiver, as documented in the [Custom Receiver](#receiver-setup) section below

<Callout type="info">
  This is only supported in Mux Player version 3.4.1 or greater
</Callout>

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
<mux-player
  id="player"
  playback-id="your-playback-id"
  playback-token="your-playback-token"
  drm-token="your-drm-token"
  cast-receiver="your-cast-receiver-app-id"
></mux-player>
```

### Custom Web Sender

If you're not using Mux Player, Google offers SDKs for [Web](https://developers.google.com/cast/docs/web_sender), [Android](https://developers.google.com/cast/docs/android_sender) and [iOS](https://developers.google.com/cast/docs/ios_sender). In this section we'll walk through installing the Web SDK and sending a video with DRM to a custom receiver.

#### Requirements

Before you build your own custom web sender, you'll need to create a [playback token](https://www.mux.com/docs/guides/secure-video-playback) and a [DRM license token](https://www.mux.com/docs/guides/protect-videos-with-drm#sign-a-drm-license-token).

**Note:** Google Cast and DRM only works in secure contexts, such as HTTPS or localhost.

#### Import Cast SDK

Include the following script wherever you want to show a cast button.

```html
<script src="https://www.gstatic.com/cv/js/sender/v1/cast_sender.js?loadCastFramework=1"></script>
```

#### Configure Cast SDK

Before we can cast any videos we need to configure the cast context. The cast framework gives us a great place to do that, in the global `__onGCastApiAvailable` function.

```javascript
window['__onGCastApiAvailable'] = function(isAvailable) {
  if (isAvailable) {
    cast.framework.CastContext.getInstance().setOptions({
      receiverApplicationId: 'your-receiver-app-id',
      autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED,
    });
  }
};
```

For DRM, you'll need to build your own receiver and add the ID to `receiverApplicationId`. We can help you with that in our [custom receiver guide](#receiver-setup).

#### Send Video to Receiver

Let's write a function to encapsulate sending our video over to the receiver.

First we're going to collect all the data we need to play the video in variables at the top.

```javascript
function playVideo(context) {
  const playbackId = 'your-playback-id';
  const playbackToken = 'your-playback-token';
  const drmToken = 'your-drm-token';
  const mediaUrl = `https://stream.mux.com/${playbackId}.m3u8?token=${playbackToken}`;
```

Each of these variables helps identify a single asset.

* `playbackId`: An asset can have one or many playback IDs. This is different from the asset ID. You can find it in the API, or in the Mux Dashboard.
* `playbackToken`: A signed playback token, as documented in our [Guide to Secured Video Playback](https://www.mux.com/docs/guides/secure-video-playback).
* `drmToken`: A signed DRM token, as documented in the [Sign a DRM license token](https://www.mux.com/docs/guides/protect-videos-with-drm#sign-a-drm-license-token) section of our DRM guide.

Then we'll build a `MediaInfo` object to include all the information Google Cast needs to play the video.

```javascript
let mediaInfo = new chrome.cast.media.MediaInfo(mediaUrl, 'application/x-mpegurl');

// Mux HLS URLs with DRM will always use `fmp4` segments.
mediaInfo.hlsSegmentFormat = chrome.cast.media.HlsSegmentFormat.FMP4;
mediaInfo.hlsVideoSegmentFormat = chrome.cast.media.HlsVideoSegmentFormat.FMP4;

// Send the information needed to create a new license url.
mediaInfo.customData = {
  mux: {
    playbackId,
    tokens: {
      drm: drmToken
    }
  }
}
```

And finally we'll ask the receiver to load the video.

```javascript
const request = new chrome.cast.media.LoadRequest(mediaInfo);

// Cast the video.
context.getCurrentSession().loadMedia(request).then(() => {
  console.log('Successfully loaded the media');
}).catch((err) => {
  console.log(`Media playback error code: ${err}`);
});
```

Here's the function in its entirety:

<CollapsibleRoot>
  <CollapsibleTrigger>
    View example code
  </CollapsibleTrigger>

  <CollapsibleContent>
    ```javascript
    function playVideo(context) {
      const playbackId = 'your-playback-id';
      const playbackToken = 'your-playback-token';
      const drmToken = 'your-drm-token';
      const mediaUrl = `https://stream.mux.com/${playbackId}.m3u8?token=${playbackToken}`;
      let mediaInfo = new chrome.cast.media.MediaInfo(mediaUrl, 'application/x-mpegurl');

      // Mux HLS URLs with DRM will always use `fmp4` segments.
      mediaInfo.hlsSegmentFormat = chrome.cast.media.HlsSegmentFormat.FMP4;
      mediaInfo.hlsVideoSegmentFormat = chrome.cast.media.HlsVideoSegmentFormat.FMP4;

      // Send the information needed to create a new license url.
      mediaInfo.customData = {
        mux: {
          playbackId,
          tokens: {
            drm: drmToken
          }
        }
      }

      const request = new chrome.cast.media.LoadRequest(mediaInfo);

      // Cast the video.
      context.getCurrentSession().loadMedia(request).then(() => {
        console.log('Successfully loaded the media');
      }).catch((err) => {
        console.log(`Media playback error code: ${err}`);
      });
    }
    ```
  </CollapsibleContent>
</CollapsibleRoot>

Now we need to hook it up to the cast action. In this example we'll send it as soon as possible by listening for the cast session to start.

```javascript
let context = cast.framework.CastContext.getInstance();
context.addEventListener(cast.framework.CastContextEventType.SESSION_STATE_CHANGED, function(event) {
  switch (event.sessionState) {
    case cast.framework.SessionState.SESSION_STARTED:
    case cast.framework.SessionState.SESSION_RESUMED:
      playVideo(context);
      break;
  }
});
```

#### Add Cast Button

Once all your code is hooked up, adding the button is the easiest part. Put this in the HTML of your page and you're good to go.

```html
<google-cast-launcher>Launch</google-cast-launcher>
```

**More Docs**

* [JavaScript SDK](https://developers.google.com/cast/docs/web_sender)
* [iOS SDK](https://developers.google.com/cast/docs/ios_sender)
* [Android SDK](https://developers.google.com/cast/docs/android_sender)

## Receiver Setup

A Google Cast receiver is a web page that receives data from a sender and plays your video. This is always a webpage, written in HTML and JavaScript, even if you're using a mobile SDK for casting.

### Receiver Prerequisites

Before we can write the code, we need to take care of a couple of prerequisites.

1. You need a way for devices to access this web page with a public URL. We recommend either hosting the files yourself, or using [ngrok](https://ngrok.com/docs/getting-started/) during development to expose local files at a public URL.
2. You need to register the receiver's public URL with Google in the [Google Cast SDK Developer Console](https://cast.google.com/u/1/publish/). If you have not joined the Google Cast Developer program, do so now.
3. Once you have registered your receiver, you will see your app listed in the [dashboard](https://cast.google.com/u/0/publish/) with a unique Application ID. This is the same receiver ID you will configure in the [Mux Player](https://www.mux.com/docs/guides/mux-player-web) or [Custom Sender](#sender-setup).
4. Until your app is published, you can only cast to registered development devices. This registration requires the destination device's serial number. If you can't find the serial number on the outside of the device, you can use the Chrome browser to cast the [dashboard](https://developers.google.com/cast/codelabs/cast-receiver) directly to the device. This will show a new screen, prominently displaying the device's serial number.

### Receiver Implementation

Google's custom receiver SDK has a lot of functionality built in, so there isn't much work left for us. The only thing we need to do is manage the DRM license URL. Before we start working with license URLs, we'll want a small HTML file to describe the playback UI. Here's a minimal example.

```html
<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="utf-8">
        <title></title>
        <!-- Web Receiver SDK -->
        <script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
        <!-- Cast Debug Logger -->
        <script src="//www.gstatic.com/cast/sdk/libs/devtools/debug_layer/caf_receiver_logger.js"></script>
    </head>

    <body>
        <cast-media-player></cast-media-player>
        <footer>
            <script src="js/receiver.js"></script>
        </footer>
    </body>
</html>
```

In the above example, the two script tags in the `<head>` load the SDK and the Google Cast debug logger. Further down you'll see a `<cast-media-player>` element, which the Google Cast SDK automatically turns into a video player, and a script for managing the DRM.

Now let's create that script. To start, we're going to grab a reference to the cast context and configure the logger.

```javascript
const context = cast.framework.CastReceiverContext.getInstance();

/**
 * DEBUGGING
 */
const castDebugLogger = cast.debug.CastDebugLogger.getInstance();
const LOG_TAG = 'MUX';
castDebugLogger.setEnabled(true);

// Debug overlay on tv screen.
// You don't need this if you're debugging using the cast tool (https://casttool.appspot.com/cactool) as it will show the logs in your browser.
castDebugLogger.showDebugLogs(true);

castDebugLogger.loggerLevelByTags = {
  [LOG_TAG]: cast.framework.LoggerLevel.DEBUG,
};
```

Next we're going to intercept all playback requests and see if the request includes DRM license information.

```javascript
context.getPlayerManager().setMediaPlaybackInfoHandler((loadRequest, playbackConfig) => {
  const customData = loadRequest.media.customData || {};

  if (customData.mux && customData.mux.tokens && customData.mux.tokens.drm) {
```

In that conditional, let's build our license URL and add it to the `playbackConfig`.

```javascript
    playbackConfig.licenseUrl = `https://license.mux.com/license/widevine/${customData.mux.playbackId}?token=${customData.mux.tokens.drm}`;
  }

  playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;
  castDebugLogger.debug(LOG_TAG, 'license url', playbackConfig.licenseUrl);

  return playbackConfig;
});
```
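The license URL construction above is plain string interpolation, so it can be factored into a small pure helper if you want to unit test it outside the receiver. This is an optional sketch; `buildWidevineLicenseUrl` is our own name, not part of the Cast SDK or Mux's APIs.

```javascript
// Optional helper mirroring the license URL built in the handler above.
// The function name is ours, not part of any SDK.
function buildWidevineLicenseUrl(playbackId, drmToken) {
  return `https://license.mux.com/license/widevine/${playbackId}?token=${drmToken}`;
}
```

Keeping this logic in a pure function makes it trivial to verify the URL shape without spinning up a Cast device.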

Finally, we start listening for incoming requests with the line:

```javascript
context.start();
```

Click the button below to view the full JavaScript file.

<CollapsibleRoot>
  <CollapsibleTrigger>
    View example code
  </CollapsibleTrigger>

  <CollapsibleContent>
    ```javascript
    const context = cast.framework.CastReceiverContext.getInstance();

    /**
     * DEBUGGING
     */
    // https://developers.google.com/cast/docs/debugging/cast_debug_logger
    const castDebugLogger = cast.debug.CastDebugLogger.getInstance();
    const LOG_TAG = 'MUX';
    castDebugLogger.setEnabled(true);

    // Debug overlay on tv screen. You don't need this if you're debugging using the cast tool (https://casttool.appspot.com/cactool) as it will show the logs in your browser.
    castDebugLogger.showDebugLogs(true);

    castDebugLogger.loggerLevelByTags = {
        [LOG_TAG]: cast.framework.LoggerLevel.DEBUG,
    };

    /**
     * DRM SUPPORT
     */
    context.getPlayerManager().setMediaPlaybackInfoHandler((loadRequest, playbackConfig) => {
      const customData = loadRequest.media.customData || {};

      if (customData.mux && customData.mux.tokens && customData.mux.tokens.drm) {
        castDebugLogger.debug(LOG_TAG, 'Setting license URL.');
        playbackConfig.licenseUrl = `https://license.mux.com/license/widevine/${customData.mux.playbackId}?token=${customData.mux.tokens.drm}`;
      }

      playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;

      castDebugLogger.debug(LOG_TAG, 'license url', playbackConfig.licenseUrl);

      return playbackConfig;
    });

    /**
     * START LISTENING FOR CASTS
     */
    context.start();
    ```
  </CollapsibleContent>
</CollapsibleRoot>
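For the receiver's `setMediaPlaybackInfoHandler` to find `customData.mux`, your sender has to attach that object to the media it loads. Here's a minimal sketch of the payload shape the receiver code above expects; `buildMuxCustomData` is our own helper name, not part of any SDK.

```javascript
// Builds the customData payload the receiver above reads from
// loadRequest.media.customData. The helper name is ours; the shape
// (mux.playbackId, mux.tokens.drm) is what the receiver code expects.
function buildMuxCustomData(playbackId, drmToken) {
  return {
    mux: {
      playbackId: playbackId,
      tokens: { drm: drmToken },
    },
  };
}
```

On a Web Sender, you would typically assign the result to `mediaInfo.customData` before creating the load request.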

## More docs

* [Google's custom receiver docs](https://developers.google.com/cast/codelabs/cast-receiver#0)
* [Debug logger docs](https://developers.google.com/cast/docs/debugging/cast_debug_logger)
* [Adding the Mux Data SDK to Chromecast](https://www.mux.com/docs/guides/monitor-chromecast)

## Testing

Once you've completed every step (triple-check the [overview](#overview) steps!), load up your sender, click the cast button, and choose your test device. After a quick loading screen, your DRM-protected video will start playing.

If you're testing our example, the video will automatically start playing behind the debug overlay. You can remove the overlay by commenting out the line `castDebugLogger.setEnabled(true);` in your custom receiver.


# Limit which Environments a user has access to in the Dashboard
Learn how to restrict which Environments a user can see in the Dashboard
This feature allows Admins to limit which Environments a given user can access within the Dashboard.

## Admin use of Environment restrictions

All management of Environment access is done in the Dashboard under **User > Organization**.

Admins have the following permissions:

* Access to all Environments
* View what Environments any given user has access to
* See what users can access a specific Environment
* Apply Environment restrictions to a user invitation
* Manage Environment restrictions for all users

## Inviting new users with Environment restrictions

Upon inviting a new user to an organization, Admins must provide at least one (1) Environment that the new user can access.

## Modify Environment access for existing users

Admins can modify a user's Environment restrictions at any time. Clicking a user under **User > Organization** displays the list of Environments that user can access in the Dashboard; access can be toggled on or off and applied on Save. Every user must have access to at least one (1) Environment.
