
Project Overview

One of the features discussed for the Asterisk 13 projects was the ability to play back media from a URI to a channel or bridge.


The conversation on the mailing list included both playback of media from a URI and the injection of a unicast media stream into a channel/bridge. At this time, the latter is considered a separate project from this one.

The primary reason to add this feature is scalability. Allowing sounds to be placed on a remote HTTP server allows a cluster of Asterisk servers to access and pull down the sounds as needed. This is much easier for system administration, as the sounds don't have to all be pushed to individual Asterisk machines.

Requirements and Specification


Normally, Asterisk eschews usage of a file extension and instead picks the best extension available based on the native formats of the channel. In the case of a remote URI, this is impractical, as the URI represents a unique location to retrieve: the optimal file extension may not be present, and checking for each file extension may be costly.

As such, the extension of the resource to retrieve should be provided; however, depending on the channel formats, that may not result in a playable file. That is expected behaviour.

URI Schemes

Playback of a remote URI should support both HTTP and HTTPS.


When a sound file has been retrieved, it should be cached on the local server. Management of the file should follow standard HTTP caching rules.

Generally, the rules should follow as such:

  • If a Cache-Control header is included, Asterisk will obey whatever rules it specifies. In particular, the following should be checked:
    • If no-cache is specified, the resulting file is marked as always 'dirty' - that is, we have to always retrieve a new resource from the server.
    • If no-store is specified, Asterisk must attempt to purge the file before shutting down, and will always retrieve the resource from the server.


      We can't play back the file if we don't store it temporarily on disk. With file streams, we may not get notified when the playback completes, so it may be difficult to purge it out of the cache until Asterisk shuts down. If Asterisk does shut down however, we have an opportunity to flush it out of any semi-permanent storage we may have set up.

    • If s-maxage or max-age is specified, mark the file as 'dirty' after that many seconds.
  • If the cached resource is 'dirty', compare the ETag header on the remote resource to the local ETag. If the two differ, retrieve the resource and update the entry in the cache. If the cached resource was retrieved with no-store, then we always retrieve the full resource.
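The caching rules above can be sketched as a small decision routine. This is an illustrative Python sketch, not Asterisk's actual implementation; the entry structure and function names are assumptions:

```python
import time

def is_dirty(entry, now=None):
    """Decide whether a cached entry must be re-validated against the server.

    `entry` is a dict of response metadata captured at retrieval time:
    {"retrieved_at": epoch_secs, "cache_control": {...}, "etag": "..."}
    (an illustrative structure, not Asterisk's actual bucket metadata).
    """
    now = time.time() if now is None else now
    cc = entry.get("cache_control", {})
    if "no-cache" in cc or "no-store" in cc:
        return True  # always go back to the server
    # s-maxage takes precedence over max-age for a shared cache
    max_age = cc.get("s-maxage", cc.get("max-age"))
    if max_age is not None:
        return (now - entry["retrieved_at"]) > max_age
    return False  # no caching directives: keep using the local copy

def needs_refetch(entry, remote_etag):
    """A dirty entry is only re-downloaded when the ETag actually changed."""
    if "no-store" in entry.get("cache_control", {}):
        return True  # no-store entries are always fully re-retrieved
    return entry.get("etag") != remote_etag
```

A no-cache entry is always dirty but may still skip the download when the ETag matches; a no-store entry skips that optimization entirely.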

CLI commands and an ARI resource should be provided to view the local cached files, the last time they were updated, and their originating URIs. Similar mechanisms should be provided to flush the entire cache.


Video is particularly difficult, as it typically requires multiple files (one video, one audio) to be useful. Container formats are beyond the scope of this project. To support video, playback of remote URIs should support a parallel form of retrieval using the pipe (|) character:

same => n,Playback(https://example.com/media/monkeys.wav|https://example.com/media/monkeys.h264) ; illustrative URLs

The behaviour of files retrieved in such a fashion is as follows:

  • Playback does not begin until all files are retrieved.
  • Files are treated individually in the cache.
  • While files with different names can be played back, the first file in the parallel list will be the actual "core sound file" played back. That is, assuming monkeys.h264 and monkeys.wav already existed on the server, the Playback operation would operate on sound file monkeys, which would play both the audio and video files.
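The shared-prefix behaviour described above (one base name on disk, multiple extensions) could be sketched as follows; the hashing scheme, cache path, and helper name are hypothetical:

```python
import hashlib
import os

def local_names(media_group, cache_dir="/var/spool/asterisk/media_cache"):
    """Map a parallel media group (list of URIs) to local file paths that
    share one base name and differ only by extension, so the file core can
    open both the audio and video streams under a single sound name.
    """
    # Derive one stable prefix from the first URI in the group
    # (the "core sound file" per the spec above).
    first = media_group[0]
    prefix = hashlib.md5(first.encode()).hexdigest()[:8]
    paths = []
    for uri in media_group:
        ext = os.path.splitext(uri)[1]  # keep each file's own extension
        paths.append(os.path.join(cache_dir, prefix + ext))
    return paths
```

With this scheme, monkeys.wav and monkeys.h264 land on disk as, say, 77asdf7a.wav and 77asdf7a.h264, matching the cache listing shown later in this page.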


The cached entries and their metadata should be stored in the AstDB. When Asterisk starts, the entries in the AstDB should be verified to still be accurate (that is, that the local files still exist), and if so, the buckets for them re-created.

Playback of a URI

Playback of a URI can be done via any of the supported playback operations, through any API. This functionality should be a part of the Asterisk core, and available through any of the media playback operations.

same => n,Playback(https://example.com/media/monkeys.wav) ; illustrative URL

Note that this can be combined with multiple URIs or sounds to form a playlist:

same => n,Playback(https://example.com/media/monkeys.wav&tt-monkeys) ; illustrative URL

Since an & is valid in a URI but is also used as a separator in the dialplan, a literal ampersand in a resource cannot be supported. If an ampersand is used in a URI (say, as part of a query), then the entire URI must be URI encoded.


URI encoded ampersands within a resource can't be supported: once the URI is decoded, an & that was encoded as part of the resource can no longer be distinguished from an & that separates query parameters.
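The ampersand problem can be demonstrated in a few lines. This sketch emulates the dialplan's split-then-decode order; the function name and example URI are illustrative:

```python
from urllib.parse import quote, unquote

# Dialplan arguments are split on a literal '&' *before* any URI
# decoding, so a URI containing '&' must arrive fully URI encoded.
raw = "http://example.com/sounds?name=monkeys&fmt=wav"  # hypothetical URI
encoded = quote(raw, safe="")  # no literal '&' survives encoding

def split_playback_args(arg_string):
    """Emulates the dialplan separator: split on '&', then decode each part."""
    return [unquote(part) for part in arg_string.split("&")]
```

The encoded form round-trips as a single resource, while the raw form is wrongly split at the query's ampersand into two "resources".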


Playback can be done on either a channel or a bridge. Since the resource being requested is a URI, it should be presented in the body of the request encoded using text/uri-list (see RFC 2483). Note that we still need to provide a media query parameter that specifies the resource type to play - the actual 'entity' being played back is simply specified in the body of the request.


Content-Type: text/uri-list

This format works nicely with simple playlists, as it can specify multiple files to retrieve. Note that these files should be played back sequentially as a playlist (which is not yet supported, but will need to be by the time we get here!)


Content-Type: text/uri-list
# Audio File One (I'm a comment and should be ignored)
# Comment comment comment

Note that when the Content-Type is text/uri-list, the resource specified by the uri media scheme is simply tossed away, as we can only have a single list of URIs. Note that this approach is somewhat limiting, in that it cannot express parallel playback of multiple resources.
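Parsing a text/uri-list body per RFC 2483 is straightforward; this is a Python sketch of what the proposed ast_http_get_uri_list would do, not the actual C implementation:

```python
def parse_uri_list(body):
    """Parse a text/uri-list body (RFC 2483): one URI per line,
    lines beginning with '#' are comments, blank lines are ignored.
    """
    uris = []
    for line in body.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        uris.append(line)
    return uris
```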

Another option is to provide the URIs in JSON. This would allow parallel playback of files for video support. The JSON must follow a schema along these lines:


{
    "media": "playlist",
    "playlist": [
        { "media_group": [
              { "scheme": "uri",
                "resource": "" },
              { "scheme": "uri",
                "resource": "" }
          ] },
        { "media_group": [
              { "scheme": "uri",
                "resource": "" }
          ] }
    ]
}

There's some obvious differences here:

  1. The schema type must be playlist. We can't use a uri schema type, as doing so would prevent combining URI resources with other media resource types for playlists.
  2. The playlist must be strongly typed. Otherwise, swagger code generation turns into a nightmare. As such, that means a new strongly named body parameter must be used.
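A minimal structural validation of such a playlist body might look like the following sketch (illustrative only; real validation would come from the Swagger-generated stubs):

```python
def validate_playlist(body):
    """Check the rough shape of the proposed playlist body:
    {"media": "playlist",
     "playlist": [{"media_group": [{"scheme": ..., "resource": ...}, ...]}, ...]}
    Returns True when the structure is sound.
    """
    if not isinstance(body, dict) or body.get("media") != "playlist":
        return False
    groups = body.get("playlist")
    if not isinstance(groups, list) or not groups:
        return False
    for group in groups:
        items = group.get("media_group") if isinstance(group, dict) else None
        if not isinstance(items, list) or not items:
            return False
        for item in items:
            if not isinstance(item, dict) or "scheme" not in item or "resource" not in item:
                return False
    return True
```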


Configuration

No configuration should be needed.


Core - media_cache

This new file in the Asterisk core will provide the following:

  • A universal API that the core and the rest of Asterisk can use to query for something in the media cache.
  • Caching of the bucket information in the AstDB
  • Informing the bucket implementation that a new bucket has been created during startup (or otherwise creating the buckets itself)
  • CLI commands

Implementations of a cache should implement the bucket API for a particular scheme.


CLI Commands

core show media-cache

Just for fun! It'd be good to see what is in the cache, the timestamps, etc.

*CLI>core show media-cache

URI                                  Last update                Local file
                                     2014-10-30 00:52:25 UTC    /var/spool/asterisk/media_cache/ahsd98d1.wav
                                     2014-09-14 10:10:00 UTC    /var/spool/asterisk/media_cache/77asdf7a.wav
                                     2014-09-14 10:10:00 UTC    /var/spool/asterisk/media_cache/77asdf7a.h264

3 items found.

Note that the last two files would have been created using a preferred file prefix. This allows the file and app core to "find" both the audio and the video file when opening the stream returned by the file_path from the media_cache.

core clear media-cache

This is really a handy way for a system administrator to force files to be pulled down.

*CLI>core clear media-cache

3 items purged.

*CLI>core show media-cache

URI                                  Last update                Local file
0 items found.


File Retrieval

A module should be implemented that does the actual work of using libcurl to retrieve a media file from a remote server, subject to caching constraints, if any exist. It should:

  • Implement the HTTP caching, noted previously
  • cURL files down to a local file, create a bucket, and store the bucket along with the appropriate metadata

Core - Usage of ast_openstream

Prior to calling ast_openstream, users who want to support URI playback should first:

  • Determine if the provided "file" is actually a URI. A simple 'strncmp' is sufficient for this.
  • If so, ask the cache for the actual file. Use that for subsequent calls to ast_openstream and ast_openvstream.
  • If not, move on as normal.
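The URI detection step above amounts to a prefix check; sketched here in Python (the real code would be a strncmp in C):

```python
def is_remote_uri(name):
    """The 'simple strncmp' check: does the requested sound name start
    with a scheme this cache supports? Only http/https per this spec.
    """
    return name.startswith(("http://", "https://"))
```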

There are other callers of ast_openstream, but it's probably not worth updating ExternalIVR (sorry!).

Core - file.c::ast_streamfile

This covers most functionality, as most dialplan applications (and other things) end up calling ast_streamfile.

AGI - res_agi.c::handle_streamfile | handle_getoption

AGI implementations of basic sound playback, which emulate their dialplan counterparts. These will need to be updated in the same way as ast_streamfile.

Core - HTTP server

Support for a new Content-Type, text/uri-list, needs to be added to the HTTP server. Since we typically parse the body as JSON using ast_http_get_json, this should be a new public function ast_http_get_uri_list that pulls a body as a text/uri-list. Since we now have core support for URIs, a bit of basic support for a list of URIs should be added to uri.h and used by the HTTP server.




Swagger Schema

Providing a 'type' for the media parameter is a bit tricky. We have the following situation:

  • If the media parameter does not specify a resource type of uri, treat it as a string.
  • If the media parameter does specify a resource type of uri, look to the body as a URI list.
  • If the media parameter is in the body, however, it must be of type string. Hence, we cannot specify the URIs directly in the media parameter. Thus, to specify URIs, a body parameter of playlist must be provided and the URIs provided in a Playlist model object.

Note that the following for channels.json would be repeated for bridges.json:



Mustache Templates

The Mustache templates generated will need to be modified to check for text/uri-list as a possible body type:

  • The existing body_parsing.mustache should be renamed to note that it parses JSON. The caller of the body_parsing routines should be made to only look at the results if the function returns non-NULL.
    • If it does return NULL, it should also check for a text/uri-list using the new function in http.h.
    • The body parsers should be updated for a playback operation to return the structured Playback object.
  • A new text_uri_body_parser should be added that parses a body into a struct ast_uri_list. This should be hard-typed to convert the URI list into a structured Playback object.


    This is limiting, but for now, we don't have any use for a URI list in ARI outside of specifying a list of media resources. If that assumption proves false later, that code should be re-visited.

Test Plan

Unit Tests

/main/media_cache/exists/nominal - Nominal test that verifies a valid item in the cache can be found
/main/media_cache/exists/non_existent - Test that verifies that an item in the cache that doesn't exist isn't located
/main/media_cache/exists/bad_scheme - Verify that we reject a URI with an unknown or unsupported scheme
/main/media_cache/exists/bad_params - Pass in bad parameters, make sure we get back a "huh?"
/main/media_cache/retrieve/nominal - Retrieve a file from the Asterisk HTTP server (using magic!) and make sure it is in the cache. Retrieve it again and make sure we didn't do an HTTP request twice.
/main/media_cache/retrieve/non_existent - Ask for a file that doesn't exist. Get an error back.
/main/media_cache/retrieve/bad_scheme - Ask for something that has a bad URI scheme.
/main/media_cache/retrieve/bad_params - Ask for something with no URI and no file_path. Make sure we reject it.
/main/media_cache/retrieve/new_file - Retrieve a media file, cache it. Update the file, ask for the file again; make sure it gets the new copy.
/main/media_cache/retrieve/preferred_file_name - Ask for a file with a preferred file name; verify that we retrieve the file and set the file name accordingly (with the right extension).
/main/media_cache/delete/nominal - Put something in the cache. Call delete; verify the cache is purged.
/main/media_cache/empty/nominal - Call delete on an empty cache; make sure everything is cool.
/main/media_cache/update/create - Verify that a new item in the cache can be created
/main/media_cache/update/update - Verify that an existing item in the cache can be updated
/main/uri/to_string - Test that a constructed URI can be converted back to a string
/main/uri/list_ctor - Verify that we can make a URI list
/main/uri/list_append_nominal - Test appending URIs to a list
/main/uri/list_append_off_nominal - Test appending things that aren't a URI, and make sure we fail appropriately
/main/uri/iterator_ctor - Test nominal creation of an iterator
/main/uri/iterator_ctor_off_nominal - Test off-nominal creation of an iterator with a bad list
/main/uri/iterator_iteration - Test nominal iteration over lists. Include empty lists.

Asterisk Test Suite

A regression test that verifies that the current read functionality of CURL is maintained
Verify that we can cURL a file down and store it
funcs/func_curl/curl_opt/timestamp - Verify that we can retrieve the timestamp from a resource
tests/apps/playback/uri - Verify that the Playback dialplan application can play back a remote URI
tests/apps/control_playback/uri - Verify that the ControlPlayback dialplan application can play back a remote URI
tests/fastagi/stream-file-uri - Verify that the AGI Stream File command can play a URI
tests/fastagi/control-stream-file-uri - Verify that the AGI Control Stream File command can play a URI
tests/rest_api/channels/playback/playlist - Verify that a Playback resource that contains a playlist can be played back to a channel
tests/rest_api/bridges/playback/playlist - Verify that a Playback resource that contains a playlist can be played back to a bridge
tests/rest_api/channels/playback/uri - Verify that a Playback resource can be created from a URI for a Channel
tests/rest_api/bridges/playback/uri - Verify that a Playback resource can be created from a URI for a Bridge

Project Planning

The following are rough tasks that need to be done in order to complete this feature. These are meant to be guidelines, and should not necessarily be followed verbatim. Note that many of these are actually independent of each other, and can be worked out simultaneously. If you're interested in helping out with any of these tasks, please speak up on the asterisk-dev mailing list!

The various phases are meant to be implemented as separately as possible to ease the process of peer review.

Phase One - Core Media Cache

See peer reviews:

Implement the basic API - Mask callbacks into the bucket API based on the URI scheme being requested. Add handling for manipulating the created bucket's local file if the predefined filename is provided. (Done)
Integrate with the AstDB

When items are created via a bucket create or retrieve, update entries in the AstDB.

When Asterisk is started, re-create buckets based on the entries currently in the AstDB.

Add CLI commands - Both for showing elements in the media cache as well as for purging the cache. If the cache is purged, remove entries from the AstDB. (Done)

Phase Two - res_http_media_cache

Create http_media_cache. Add bucket wizards for schema types http and https.
  • Define basic unit tests for creation/deletion
  • Implement bucket wizard callbacks for basic manipulation
Generally, get the basic structure of the thing defined, implement unit tests, and make sure the unit tests fail. TDD. (Done)
Implement retrieval. Use the underlying CURL function to retrieve a provided URI and store as a temporary file (or use the predetermined filename).

A few observations:

  • This shouldn't worry about marking the cache entries as dirty yet using the caching rules. This should:
    • Check to see if we have a bucket with the URI. If not, create the bucket.
    • cURL the resource down into the provided file path.
    • Update the bucket entry with the now locally stored file.
    • Return the location of the local file on the file system.
Implement 'dirtying' of cache entries based on HTTP caching rules - Implement handling of ETags as well as the Cache-Control header. (Done)

Phase Three - Core/dialplan/AGI implementations

Update the file core to use the http_media_cache - Update the core file users of ast_openstream to first look for the resource in the http_media_cache. If found, use the returned file. (Done)
Update the dialplan users - Same as the core, except for dialplan functions. (Done)
Add tests for Playback and ControlPlayback. (Done)
Update the AGI users - Same as the core, except for res_agi functions. (Done)
Add tests for stream file and control stream file. (Done)

Phase Four - ARI Playlists


This is actually a completely separate and super useful feature. URI playbacks really need it to function so... here it is.

Note that this does not envision complete playlist control (such as 'skip to next sound in the playlist'). That could be added either as part of this work or at a future date.

Update the JSON schema with JSON playlists - This should require updates to the Playbacks model, as well as the play operations for channels and bridges. The mustache templates may need to be updated to properly extract this complex a body type. (Not Done; see note below)
Re-generate stubs; add connecting logic - Re-generating the stubs will create a new body handler and a more complex 'playlist' object that can be optionally present. This will be passed to the resource_channels and resource_bridges operations. (Not Done; see note below)
Add a 'playlist' media resource type
  • Update resource_channels:ari_channels_handle_play and resource_bridges:ari_handle_play (boiling down to ari_bridges_play_helper) to understand a body playlist. This should pass it off to stasis_app_control_play_uri or an equivalent function for actual handling.
  • Update stasis_app_control_play_uri or add an equivalent function to actually play the list.
(Not Done; see note below)
Update res_stasis_playback - The various function calls boil down to play_on_channel in res_stasis_playback. This is passed the actual Playback resource object, which can now contain a Playlist. The function should be updated to parse out the various items in the playlist and pass them to ast_control_streamfile_lang. (Not Done; see note below)
Add rest_api tests for playlists. (Not Done; see note below)

Arguably, we don't really need a 'playlist' media resource type. Lists of media are now played back in sequence by simply specifying multiple media URIs in a sequence, e.g., media=sound:foo.wav,sound:bar.wav, or as a list, e.g., media=sound:foo.wav,media=sound:bar.wav.

This works for remote URIs as well, although admittedly the syntax is a bit clunky right now.


Phase Five - HTTP Server Updates

Update uri.h to support URI lists and URI iterators. Add unit tests! (Not Done)
Update http.h to support body types of text/uri-list. Generate an ast_uri_list as a result of said body type. (Not Done)

Phase Six - ARI text/uri-list support/URI playbacks

Update mustache templates to understand a body type of text/uri-list; re-generate appropriate stubs - Body parsing should be made to handle both JSON and the text/uri-list body type. A text/uri-list body parser can be made to explicitly return Playback objects suitable for consumption in the existing playlist mechanisms. (Not Done)
Wire generated code to resource_channels and resource_bridges. Update API callbacks as needed.

Generally, this should "just work" (or nearly) at this point. We have:

  • Understanding of URI playbacks in the core
  • Understanding of playlists in ARI
  • The ability to handle URI playbacks (including parallel playbacks) in ast_control_streamfile_lang, which is what is used by res_stasis_playback:play_on_channel.

Most of this should be just gluing the pieces together as needed.

Not Done
Add URI playback tests to Asterisk Test Suite. Not Done

JIRA Issues

ASTERISK-25654 - Playback: Add the ability to play remote URIs Closed


Reference Information



  1. Instead of including the URL for playback in the query string in ARI, can we not support POSTing some document for rendering? In the simplest case, this could be a URI list, and in the future with TTS it could be an SSML document or similar. This provides continuity, simplicity in listing multiple URLs, and no concerns about URL encoding.

    1. Yup, that's a lot better. I'll update the document - thanks!

  2. So, I don't have a solution to propose yet, but I just wanted to register an objection while I think through the solution.

    It seems to me like parallel playback as the default is very surprising. If I were to submit a list of audio files, my expectation would be that they would be played back sequentially. This is also consistent with SSML, although SSML does not provide a mechanism for parallel playback aside from multiple concurrent resources.

    1. Yeah, I can't say I like the idea of a parallel playback, but I can see some use for it for video. Since Asterisk doesn't have support for media containers (which would be great, but is a large and separate project), you have to pull back the video file and the audio file as separate actions. However, you don't want to start playback until you have both, which means there has to be some nomenclature for retrieving two resources as a single operation, while also supporting a sequential list of URIs to operate on.

      Structured data lends itself well to this problem, so another option would be to use a JSON body. Something like:

  3. I would prefer for parallel playback to be the special case, and therefore be supported by some custom (to Asterisk) format such as that you suggest, while the standardised format (text/uri-list) retains the simpler and least surprising behaviour of sequential playback.

    VoiceXML itself does not contain any functionality for parallel playback, but Dialogic's extensions do. As far as I can tell, MRCP does not contain any specific support for parallel synthesis, nor does Rayo. I'm struggling to find further precedence, though I understand the proposed use case.

    1. I had the same thought on the drive in to work. Thinking about it some more, the only time parallel retrieval should be used is for video (since Asterisk doesn't support multiple audio streams). Really, this feature is of limited use for video: any reasonably sized video file is going to be quite large and would incur a pretty heavy penalty for retrieval. I'm not saying it won't be used, just that it doesn't seem as useful as it is for audio files.

      I'll go back through the document and update it accordingly.

  4. Is there a compelling reason for supporting text/uri-list?  Sure, it's a trivial markup and a MIME-type, but with the rest of the API being JSON-based, do we really save something by using an alternate body scheme.... particularly since it is insufficient to handle the general case?  Certainly it doesn't hurt anything by having it, but I just don't see a compelling reason for it.  If you are interacting with ARI, you are capable of constructing JSON.

    1. Because this is a lowest common denominator standard for TTS engines. If Asterisk can support it natively also for file playback as well as supporting passing it (or SSML) through to a TTS engine for rendering, it makes consumers of the API more consistent.

      1. Ah, that's reasonable, then

        1. I agree that it's reasonable, although for now, I'm kind of holding off working on it.

          The existing implementation, while a bit clunky, will accept an HTTP/HTTPS URI for playback.


          That doesn't solve the case of video, where you need to pull back both the video media as well as the audio media and start the playback of both simultaneously, but in the case of Asterisk, that's kind of an edge case. (Arguably, this feature - which doesn't yet support streaming media - isn't really suitable for video in the first place.)

  5. Pardon my ignorance if so, but what is the purpose of the additional `media=` prefix?  We will already be inside a media context, and so far as I have seen (I only know ARI), the media URIs are simply of the form `<type>:<value>` (e.g. `sound:tt-monkeys` or `tone:!900/500,!0/500`).

    1. Idiot me; you're just including the key for the URI parameter.  Never mind.