Using Windows Azure Media Services .NET SDK with key concepts explained

This blog walks through a demo of using the .NET SDK to complete a typical video-on-demand workflow: upload, encode, package, and stream.

1. Create a C# console application in Visual Studio and install the NuGet package windowsazure.mediaservices.

2. I've uploaded a small sample video file for you to use: http://mingfeiy.com/wp-content/uploads/2013/06/azure.wmv. I put this video file under “C:\tr\”, so I suggest you create the same folder on your C drive so you don't need to change the input video file path.

3. Here are two config XML files you are going to use as task presets: a Smooth Streaming config XML and an HTTP Live Streaming config XML. Similarly, I put them under “C:\ty\”. You could certainly get these configs from MSDN, but since each is a very long string, it is easier to just read them from an XML file.

4. This is what your Main method looks like. You need to change acc_name and acc_key to your Media Services account name and key, which you can find in the Media Services portal. The Main method shows what we are going to do: first, create an asset and upload the asset files; second, encode and package the asset into MP4, Smooth Streaming, and HLS; lastly, get a streaming URL for each asset.
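The Main method was originally shown as a screenshot; below is a minimal sketch of the same flow. The helper method names (CreateAndUploadAsset, EncodeAndPackage, GetStreamingUrl) are illustrative, and the block assumes the windowsazure.mediaservices NuGet package is installed:

```csharp
using System;
using Microsoft.WindowsAzure.MediaServices.Client;

class Program
{
    // Replace these with your Media Services account name and key from the portal.
    private static readonly string acc_name = "your_account_name";
    private static readonly string acc_key = "your_account_key";

    static void Main(string[] args)
    {
        var context = new CloudMediaContext(acc_name, acc_key);

        // Step 1: create an asset and upload the source video.
        IAsset asset = CreateAndUploadAsset(context, @"C:\tr\azure.wmv");

        // Step 2: encode and package into MP4, Smooth Streaming and HLS.
        IAsset packaged = EncodeAndPackage(context, asset);

        // Step 3: get a streaming URL for the packaged asset.
        Console.WriteLine(GetStreamingUrl(context, packaged));
    }

    // CreateAndUploadAsset, EncodeAndPackage and GetStreamingUrl are the
    // three steps walked through in the rest of this post.
}
```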

5. Here is the code for uploading an asset into blob storage:
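The upload code was originally a screenshot; here is a minimal sketch of that step, assuming the usual usings (System.IO and Microsoft.WindowsAzure.MediaServices.Client) and a CloudMediaContext created from your account name and key:

```csharp
static IAsset CreateAndUploadAsset(CloudMediaContext context, string filePath)
{
    // Media Services creates a locked blob container for the asset behind the scenes.
    IAsset asset = context.Assets.Create(
        Path.GetFileName(filePath), AssetCreationOptions.None);

    // The SDK requests a SAS upload URL, then uses the Storage SDK
    // to push the file into that container.
    IAssetFile assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
    assetFile.Upload(filePath);

    return asset;
}
```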

Question: What’s the difference between an “asset” and the container concept in blob storage?

Answer: An asset is a conceptual entity in Media Services. When you create an asset and upload the asset files associated with it, Media Services creates a locked container on your behalf. When you upload a file, the SDK requests a Shared Access Signature upload URL, which allows it to use the Storage SDK to upload your video content into blob storage.

Question: Could I create a container and upload files through the Storage SDK directly, then identify the asset and asset files in that container?

Answer: No, you can’t do that today. The asset-to-container mapping has to be established when the asset is created.

6. After the video file is in the system, we can use Media Encoder and Media Packager to transform it. The encode-and-package method is shown below. Here we introduce the concept of a Job, which can contain multiple tasks. For each task, you need to define an input asset, an output asset, and a task preset (how you want the video to be transformed). Tasks can be chained, which means the output of task 1 can become the input of task 2. A job template can also be saved for future use.
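The encode-and-package method was also shown as a screenshot; here is a sketch of a job with two chained tasks (encode to MP4, then package to Smooth Streaming), assuming usings for System.IO, System.Linq and System.Threading. The encoder preset string and the packager config file name are assumptions; substitute the preset and config files you actually use:

```csharp
static IAsset EncodeAndPackage(CloudMediaContext context, IAsset inputAsset)
{
    IJob job = context.Jobs.Create("Encode and package job");

    // Task 1: encode the source video to adaptive-bitrate MP4s.
    IMediaProcessor encoder = context.MediaProcessors
        .Where(p => p.Name == "Windows Azure Media Encoder")
        .ToList().LastOrDefault();
    ITask encodeTask = job.Tasks.AddNew("Encode to MP4", encoder,
        "H264 Adaptive Bitrate MP4 Set 720p", TaskOptions.None);
    encodeTask.InputAssets.Add(inputAsset);
    IAsset mp4Asset = encodeTask.OutputAssets.AddNew(
        "MP4 output", AssetCreationOptions.None);

    // Task 2 (chained): package the MP4 output into Smooth Streaming,
    // reading the config XML saved under C:\ty\ (file name is an assumption).
    IMediaProcessor packager = context.MediaProcessors
        .Where(p => p.Name == "Windows Azure Media Packager")
        .ToList().LastOrDefault();
    string smoothConfig = File.ReadAllText(@"C:\ty\MediaPackager_MP4toSmooth.xml");
    ITask packageTask = job.Tasks.AddNew("Package to Smooth Streaming",
        packager, smoothConfig, TaskOptions.None);
    packageTask.InputAssets.Add(mp4Asset); // output of task 1 is input of task 2
    packageTask.OutputAssets.AddNew("Smooth Streaming output",
        AssetCreationOptions.None);

    job.Submit();
    job.GetExecutionProgressTask(CancellationToken.None).Wait();
    return job.OutputMediaAssets.Last();
}
```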

Question: What’s IMediaProcessor and why should I query it?

Answer: IMediaProcessor represents the encoding engines Media Services provides. For now we have the first-party Windows Azure Media Encoder, and in the future we may have partner encoders, such as the Digital Rapids media encoder. By querying it, you tell our VMs which engine to run. Once you have queried out the right media processor, you can use different task presets based on strings; you can find more details here: http://msdn.microsoft.com/en-us/library/windowsazure/jj129582.aspx. Please remember to add .LastOrDefault() to your query so you get the latest media processor, as we constantly upgrade our media processors for better performance.
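The query described above can be sketched like this (sorting by version before .LastOrDefault() makes “latest” explicit; without the sort you rely on the service's default ordering):

```csharp
// Pick the latest version of the Windows Azure Media Encoder processor.
IMediaProcessor processor = context.MediaProcessors
    .Where(p => p.Name == "Windows Azure Media Encoder")
    .ToList()
    .OrderBy(p => new Version(p.Version))
    .LastOrDefault();
```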

7. Once encoding and packaging are done, we can create a SAS locator and an origin streaming locator. These are the URLs a media application uses to stream your content.

Question: What’s the difference between a SAS locator and an origin streaming locator?

Answer: A SAS locator, also called a Shared Access Signature locator, is a URL for accessing files in blob storage under a certain access policy, for example one that controls when the URL expires. If your media file is a progressive-download file, such as an .mp4, you should get a SAS locator for your media application to access it. However, if you want a better video experience, you probably want an adaptive streaming format such as Smooth Streaming or HTTP Live Streaming. For those, an origin server sits between your client applications and the video storage and answers every request coming from the client application: if the client asks for a two-second video fragment at a certain time, the origin server can intelligently serve those video bits. Therefore we create an origin streaming URL, which maps to the physical video storage SAS URL, so the origin server can be utilized in the middle. Below is an example of what a SAS locator and an origin streaming locator look like:
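A sketch of creating both locator types, assuming the packaged asset contains a Smooth Streaming manifest (.ism file); the 30-day access policy duration is an arbitrary choice:

```csharp
static string GetStreamingUrl(CloudMediaContext context, IAsset asset)
{
    IAccessPolicy policy = context.AccessPolicies.Create(
        "Streaming policy", TimeSpan.FromDays(30), AccessPermissions.Read);

    // Origin streaming locator: for adaptive formats (Smooth Streaming, HLS).
    ILocator originLocator = context.Locators.CreateLocator(
        LocatorType.OnDemandOrigin, asset, policy);
    IAssetFile manifest = asset.AssetFiles.ToList()
        .First(f => f.Name.EndsWith(".ism", StringComparison.OrdinalIgnoreCase));
    string smoothUrl = originLocator.Path + manifest.Name + "/manifest";

    // SAS locator: for progressive download of e.g. .mp4 files.
    // ILocator sasLocator = context.Locators.CreateLocator(
    //     LocatorType.Sas, asset, policy);

    return smoothUrl;
}
```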

[Image: Streaming locators (example SAS and origin streaming locator URLs)]

