Axinom Encoding Service sends events to defined endpoints whenever there's a change in the job state. Find out which events are emitted and what data they provide.

Event List

Axinom Encoding progress tracking is built on the "push" model: Encoding uses message publishers to send events to defined endpoints whenever a job’s state changes. Each such event contains useful information. It is up to your application to listen for these events and to define the handling behavior.

An overview of all events, the phases where they are emitted, and the related configuration sections is available in the Encoding Overview.

All events contain the same five base fields plus event-specific data. The base fields are:

{
  "JobId": "7760f88f-6382-43a6-8957-7eda3416ad47",
  "TenantId": "your-api-user-name",
  "ExternalId": "123456",
  "ExternalType": "movie",
  "ExternalProvider": "ABC"
}

Since Axinom Encoding is a completely stateless service, use these fields to correlate events with jobs.
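The correlation can be sketched as follows. This is a minimal illustration, assuming events arrive as parsed JSON dictionaries; the event-type name is taken from your message publisher's envelope, which is a delivery detail not specified here.

```python
def correlation_key(event: dict) -> tuple:
    """Extract the base fields used to correlate an event with a job."""
    return (event["JobId"], event.get("ExternalId"), event.get("ExternalType"))

def handle_event(message_type: str, event: dict, jobs: dict) -> None:
    """Accumulate per-job state keyed by JobId, since the service itself is stateless."""
    job = jobs.setdefault(event["JobId"], {"events": []})
    job["events"].append(message_type)
    if message_type == "EncodingProgress":
        job["progress"] = event["Progress"]
```

Keeping all job state on the consumer side, keyed by `JobId`, mirrors the push model: the service never needs to be queried for status.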

JobCreated

This event is sent when the pre-validation process is complete and a job is ready to start processing. It echoes the request data back in its body (excluding sensitive data).

{
    "ContentAcquisitionProvider": "Ftps",
    "ContentAcquisitionUriPath": "ftps://ftp.somewhere.com/ingest/12345678",

    "MessagePublisherTypes":[
        "FtpFileMessagePublishing",
        "AmazonSqsMessagePublishing"
    ],
    "ContentPreProcessing": {
        "VideoStreamExpression": "^.*\\.(mp4|avi|mov|wma)$",
        "AudioFileLanguageExpression": "^[^-]*-([a-zA-Z0-9\\-]+).aac$",
        "SubtitleFileLanguageExpression": "^[^-]*-sub-([a-zA-Z0-9\\-]+).vtt$",
        "CaptionFileLanguageExpression": "^[^-]*-cc-([a-zA-Z0-9\\-]+).vtt$",
        "AudioStreams": [
            {
                "Language": "en",
                "FileNameExpression": "pattern"
            }
        ],
        "SubtitleStreams": [
            {
                "Language": "en",
                "FileNameExpression": "pattern"
            }
        ],
        "CaptionStreams": [
            {
                "Language": "en",
                "FileNameExpression": "pattern"
            }
        ]
    },
    "ContentProcessing": {
        "OutputFormat": ["Dash", "Hls"],
        "VideoFormat": "H264",
        "OptimizeFor": "Balance",
        "Archiving": "None",
        "VideoRepresentations": [
            {
                "BitrateInKbps": 3000,
                "Width": 1920,
                "Height": 1080
            }
        ],
        "UseHighestPossibleBitrate": true,

        "ArchiveOutputName": "custom_name",

        "MaxArchiveSize": 1000000
    },
    "ContentPublishingProvider": "Ftps",
    "ContentPublishingUriPath": "ftps://ftp.somewhere.com/publish/123456"
}

AcquisitionProgress

Phase: Acquisition

This event is sent during source file acquisition. If the files are downloaded fast enough, only one such message is produced, directly with 100% progress. For slower connections or larger files, the progress is reported multiple times (up to 100 messages), each with an updated progress value.

{
  "Progress": 30
}

ContentAcquired

Phase: Acquisition

This event is sent after all the files have been downloaded.

{
  "FileCount": 8,
  "TotalFilesSizeInBytes": 48431964
}

ContentMapped

Phase: Media Mapping

This event is sent when Encoding has finished mapping files to video/audio/subtitle with the corresponding language.

{
  "VideoStreamMapping": {
    "FileName": "video.mp4",
    "StreamIndex": 0,
    "Language": "und",
    "SelectedViaRegex": true,
    "MediaType": "Video"
  },
  "AudioStreamMappings": [
    {
      "FileName": "video.mp4",
      "StreamIndex": 1,
      "Language": "en",
      "SelectedViaRegex": true,
      "MediaType": "Audio"
    }
  ],
  "SubtitleStreamMappings": [
    {
      "FileName": "subtitle-en.vtt",
      "StreamIndex": 0,
      "Language": "en",
      "SelectedViaRegex": true,
      "MediaType": "Subtitle"
    }
  ],
  "AllFoundStreams": [
    {
      "FileName": "video.mp4",
      "StreamIndex": 0,
      "Language": "und",
      "SelectedViaRegex": true,
      "MediaType": "Video"
    }
  ],
  "FileMappingResults": [
    {
      "FileName": "video.mp4",
      "FileSizeInBytes": 0,
      "UsedVideoStream": true,
      "UsedAudioStreams": 1,
      "UsedSubtitleStreams": 0
    }
  ]
}

When using the packaging-only mode, you receive an array of such elements (VideoStreamMappings) instead of a single VideoStreamMapping object, as multiple video source files are possible.

{
  "VideoStreamMappings": [
    {
      "FileName": "video.mp4",
      "StreamIndex": 0,
      "Language": "und",
      "SelectedViaRegex": true,
      "MediaType": "Video"
    },
    ...
  ],
  ...
}
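Since the ContentMapped payload has two shapes (a single VideoStreamMapping object in the regular mode, a VideoStreamMappings array in packaging-only mode), a small normalization helper keeps downstream handling uniform. A sketch, assuming the payload is a parsed JSON dictionary:

```python
def video_mappings(payload: dict) -> list:
    """Return video stream mappings as a list, regardless of payload shape."""
    if "VideoStreamMappings" in payload:
        # Packaging-only mode: already an array.
        return payload["VideoStreamMappings"]
    single = payload.get("VideoStreamMapping")
    # Regular mode: wrap the single object in a list.
    return [single] if single is not None else []
```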

ContentPreProcessed

Phase: Media Mapping

This event is sent after all the files have been acquired and all streams mapped and extracted.

{
  "AudioLanguages": [
    "en",
    "de"
  ],
  "SubtitleLanguages": [
    "en",
    "de",
    "fr",
    "es"
  ]
}

VideoEncodingStarted

Phase: Encoding

This event is sent when the encoding task has started.

{
  "VideoEncodingBitrates": [
    400,
    800,
    1200,
    2100
  ],
  "UseDrm": true,
  "OptimizeFor": "Quality",
  "VideoFormat": "H264"
}

EncodingProgress

Phase: Encoding

Reports the progress of the video encoding task. Encoding sends this event multiple times, up to 100 times in total.

{
  "Progress": 33
}
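AcquisitionProgress and EncodingProgress both carry a single integer Progress value (0–100) and may each arrive up to 100 times. A sketch of a per-phase progress tracker that keeps the value monotonic, assuming your transport may deliver messages out of order:

```python
def update_progress(state: dict, phase: str, event: dict) -> int:
    """Record the highest Progress seen for a phase; ignore stale deliveries."""
    current = state.get(phase, 0)
    state[phase] = max(current, event["Progress"])
    return state[phase]
```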

EncodingFinished

Phase: Encoding

This event is sent when Encoding has finished encoding the video, audio, and subtitles and has packaged them into Dash or HLS. If you requested DRM protection, this event includes the ProtectedStreams list.

Each object in this list has the following data:

  • Label - stream group (group of the streams with the same quality level: audio, sd, hd, uhd1)

  • File - protected file

  • KeyId - key used to protect this stream

  • Iv - encryption initialization vector used for encryption

Note
For CENC protected outputs, such as Dash, Dash On Demand, we use 8 bytes IV.
For CBCS protected outputs, such as HLS and CMAF, we use 16 bytes IV.
{
  "ProtectedStreams": [
    {
      "Label": "audio",
      "File": "audio-en.mp4",
      "KeyId": "5a601de9-6075-461b-955c-0a155b93b0d3",
      "Iv": {
        "Dash": "468837FB19FD8F68",
        "Hls": "468837FB19FD8F68BE246D2A738D2598"
      }
    },
    {
      "Label": "sd",
      "File": "video-H264-216-300k.mp4",
      "KeyId": "ae355f06-2f01-49d6-a090-6d2ecd486f55",
      "Iv": {
        "Dash": "090A0C7269CFD641",
        "Hls": "090A0C7269CFD641F49F062A8347C843"
      }
    },
    {
      "Label": "hd",
      "File": "video-H264-720-300k.mp4",
      "KeyId": "be355f06-2f01-49d6-a090-6d2ecd486f55",
      "Iv": {
        "Dash": "890A0C3369CFD664",
        "Hls": "890A0C3369CFD664F49F062A8347C812"
      }
    }
  ]
}
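The IV sizes from the note above can be sanity-checked programmatically: an 8-byte CENC IV is 16 hex characters, a 16-byte CBCS IV is 32. A sketch, keyed by the format names seen in the sample payload; formats not listed are left unchecked:

```python
# Expected hex-string lengths: 8-byte IV for CENC (Dash), 16-byte IV for CBCS (Hls).
EXPECTED_IV_HEX_LENGTH = {"Dash": 16, "Hls": 32}

def check_ivs(protected_stream: dict) -> bool:
    """Return True if every IV has the byte length expected for its format."""
    return all(
        len(iv) == EXPECTED_IV_HEX_LENGTH.get(fmt, len(iv))
        for fmt, iv in protected_stream["Iv"].items()
    )
```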

PackagingStarted

Phase: Packaging

This event is sent when the packaging task has started.

PackagingFinished

Phase: Packaging

This event is sent when Encoding has finished packaging the video, audio, and subtitles into Dash or HLS. If you requested DRM protection, this event includes the ProtectedStreams list in addition to the Streams list.

Each object in ProtectedStreams list has the following data:

  • Label - stream group (group of the streams with the same quality level: audio, SD, HD, UHD1)

  • File - protected file

  • KeyId - key used to protect this stream

  • Iv - encryption initialization vector used for encryption

Note
For CENC protected outputs, such as Dash, Dash On Demand, we use 8 bytes IV.
For CBCS protected outputs, such as HLS and CMAF, we use 16 bytes IV.
{
  "ProtectedStreams": [
    {
      "Label": "audio",
      "File": "audio-en.mp4",
      "KeyId": "5a601de9-6075-461b-955c-0a155b93b0d3",
      "Iv": {
        "Dash": "468837FB19FD8F68",
        "Hls": "468837FB19FD8F68BE246D2A738D2598"
      }
    },
    {
      "Label": "sd",
      "File": "video-H264-216-300k.mp4",
      "KeyId": "ae355f06-2f01-49d6-a090-6d2ecd486f55",
      "Iv": {
        "Dash": "090A0C7269CFD641",
        "Hls": "090A0C7269CFD641F49F062A8347C843"
      }
    },
    {
      "Label": "hd",
      "File": "video-H264-720-300k.mp4",
      "KeyId": "be355f06-2f01-49d6-a090-6d2ecd486f55",
      "Iv": {
        "Dash": "890A0C3369CFD664",
        "Hls": "890A0C3369CFD664F49F062A8347C812"
      }
    }
  ]
}

Each object in the Streams list contains the following data for a specific output format; multiple objects are produced when multiple output formats are selected, e.g., Hls and Dash. Empty fields are sent with a null value:

  • Video - each object contains data on all packaged video representations

    • BitrateInKbps - refers to the number of bits that are transferred in a second in Kbps

    • Codecs - Codec used to compress the video (H264, H265)

    • DisplayAspectRatio - relationship of the width of a video image compared to its height in fraction format, most commonly 16:9

    • File - video file name as it appears in the output folder

    • FileTemplate - video file template naming scheme if it’s fragmented

    • FrameRate - video frames per second

    • Width - video width

    • Height - video height

    • Iv - encryption initialization vector used for encryption if DRM is used. Otherwise null

    • KeyId - key used to protect this stream if DRM is used. Otherwise null

    • Label - video stream group (SD, HD, UHD1)

    • OutputFormat - packaging output format (HLS, Dash, DashOnDemand, Cmaf)

    • PixelAspectRatio - relationship of the width of a pixel compared to its height in fraction format, most commonly 1:1

  • Audio - each object contains data on all packaged audio files

    • BitrateInKbps - refers to the number of bits that are transferred in a second in Kbps

    • Codecs - Codec used to compress the audio (AAC)

    • File - audio file name as it appears in the output folder

    • FileTemplate - audio file template naming scheme if it’s fragmented

    • Iv - encryption initialization vector used for encryption if DRM is used. Otherwise null

    • KeyId - key used to protect this stream if DRM is used. Otherwise null

    • Label - audio stream group (audio)

    • LanguageName - full name of the language present in audio file

    • LanguageCode - 2-letter language code of the language present in the audio file

    • OutputFormat - packaging output format (HLS, Dash, DashOnDemand, Cmaf)

    • SamplingRate - the number of samples of audio carried per second, measured in Hz, presented as a number

  • Subtitle - each object contains data on all packaged subtitle files

    • BitrateInKbps - Refers to the number of bits that are transferred in a second in Kbps

    • Codecs - null

    • File - subtitle file name as it appears in the output folder

    • FileTemplate - subtitle file template naming scheme if it’s fragmented

    • Label - subtitle stream group (subtitle)

    • LanguageName - full name of the language present in subtitle file

    • LanguageCode - 2-letter language code of the language present in the subtitle file

    • OutputFormat - packaging output format (HLS, Dash, DashOnDemand, Cmaf)

  • ClosedCaption - each object contains data on all packaged closed caption files

    • BitrateInKbps - Refers to the number of bits that are transferred in a second in Kbps

    • Codecs - null

    • File - closed caption file name as it appears in the output folder

    • FileTemplate - closed caption file template naming scheme if it’s fragmented

    • Label - closed caption stream group (closed-caption)

    • LanguageName - full name of the language present in closed caption file

    • LanguageCode - 2-letter language code of the language present in the closed caption file

    • OutputFormat - packaging output format (HLS, Dash, DashOnDemand, Cmaf)

Note
For CENC protected outputs, such as Dash, Dash On Demand, we use 8 bytes IV.
For CBCS protected outputs, such as HLS and CMAF, we use 16 bytes IV.
{
  "Streams": [
    {
      "Video": [
        {
          "BitrateInKbps": 300,
          "Codecs": "H264",
          "DisplayAspectRatio": "26:15",
          "File": "hls/video-H264-240-300k.m3u8",
          "FileTemplate": "hls/video-H264-240-300k_$Number$.ts",
          "FrameRate": 29.97,
          "Height": 240,
          "Iv": null,
          "KeyId": null,
          "Label": "SD",
          "OutputFormat": "Hls",
          "PixelAspectRatio": "1:1",
          "Width": 416
        }
      ],
      "Audio": [
        {
          "BitrateInKbps": 128,
          "Codecs": "AAC",
          "File": "hls/audio-en.m3u8",
          "FileTemplate": "hls/audio-en_$Number$.ts",
          "Iv": null,
          "KeyId": null,
          "Label": "audio",
          "LanguageCode": "en",
          "LanguageName": "English",
          "OutputFormat": "Hls",
          "SamplingRate": 44100
        }
      ],
      "ClosedCaption": [
        {
          "BitrateInKbps": 0,
          "Codecs": null,
          "File": "hls/caption-en.m3u8",
          "FileTemplate": "hls/caption-en_$Number$.vtt",
          "Label": "closed-caption",
          "LanguageCode": "en",
          "LanguageName": "English",
          "OutputFormat": "Hls"
        }
      ],
      "Subtitle": [
        {
          "BitrateInKbps": 0,
          "Codecs": null,
          "File": "hls/subtitle-de.m3u8",
          "FileTemplate": "hls/subtitle-de_$Number$.vtt",
          "Label": "subtitle",
          "LanguageCode": "de",
          "LanguageName": "German",
          "OutputFormat": "Hls"
        }
      ]
    }
  ]
}
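For cataloging or logging, the nested Streams payload is often easier to consume flattened into per-stream rows. A sketch, assuming a parsed JSON dictionary and the field names documented above (stream kinds may be absent or null):

```python
def flatten_streams(payload: dict) -> list:
    """Flatten a Streams payload into (output format, kind, file) tuples."""
    rows = []
    for entry in payload.get("Streams", []):
        for kind in ("Video", "Audio", "Subtitle", "ClosedCaption"):
            # A kind may be missing or null; treat both as empty.
            for stream in entry.get(kind) or []:
                rows.append((stream["OutputFormat"], kind, stream["File"]))
    return rows
```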

ImagesExtracted

This event is sent after successful image extraction. Refer to the Images Extraction section for more details.

{
    "RelativePath": "ftpes://server.ftp.com/target/dir/",
    "Images": [
        "preview_1.jpg",
        "preview_2.jpg",
        "preview_3.jpg"
    ]
}

ContentPublished

Phase: Publishing

This event is sent after all files have been published to the external target location.

{
  "Provider": "Ftps",
  "RelativePublishStoragePath": "published/123456",
  "ManifestFileName": "Manifest.mpd",
  "FileCount": 3756,
  "TotalSizeInBytes": 1453572213
}

JobSuccess

Phase: Publishing

This message is sent after the full encoding job has finished successfully, the packaged video has been published, and all notifications have been sent.

{
    "Output": {
        "Dash": "https://full.pathtostorage.tld/published/123456/Manifest.mpd",
        "Hls": "https://full.pathtostorage.tld/published/123456/Manifest.m3u8"
    }
}
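The Output object maps each requested packaging format to its manifest URL. A consumer can pick a manifest in its own preference order; a sketch, with the default preference order being an illustrative choice, not a service recommendation:

```python
from typing import Optional

def pick_manifest(output: dict, preference=("Dash", "Hls")) -> Optional[str]:
    """Return the first available manifest URL in preference order, or None."""
    for fmt in preference:
        if output.get(fmt):
            return output[fmt]
    return None
```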

FinalError

Phase: Publishing

This event is sent in case of a fatal job processing failure. The message body contains an error code that describes the actual problem that occurred. Check out the error codes for more information.

{
  "ExceptionMessage": "Could not finish the encoding job.",
  "ErrorCode": 5000
}

Revision History

The table below lists the document versions and any changes to them.

Version Date Description

1.0

October 21, 2020

  • Initial version.

2.0

October 6, 2021