
Other articles (6)
-
Selection of projects using MediaSPIP
2 May 2011. The examples below are representative of specific uses of MediaSPIP for certain projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, is among the half-dozen associations of this kind. Its members (...) -
Other interesting software
13 April 2011. We don't claim to be the only ones doing what we do... and we certainly don't claim to be the best either... We just try to do what we do well, and to keep getting better...
The following list contains software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate...
We don't know these projects and haven't tried them, but you can take a peek.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...) -
Selection of projects using MediaSPIP
29 April 2011. The examples below are representative of specific uses of MediaSPIP for certain projects.
Do you think you have built a "remarkable" site with MediaSPIP? Let us know about it here.
MediaSPIP farm @ Infini
The Infini association develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. In this area it plays a unique role (...)
On other sites (4086)
-
Issues with processing media on Windows Azure
23 September 2015, by Ahmed Mujtaba. I have a website built on ASP.NET Web Forms that works as a media portal for users to upload videos. I'm using ffmpeg encoders to produce video content to be streamed in the browser. I use the Web Deploy method to publish the site to the Azure server. The website gets deployed properly, but I run into the following issues on the live site:
- Video never gets encoded and published; I get some sort of error.
- Video gets published, but the process of uploading and encoding the video is far too slow on the web server.
My project solution contains upload.ashx, which handles the upload requests and calls encode.ashx, which is responsible for encoding and publishing the videos. I tried to debug the site remotely, but the debugger never reaches encode.ashx.
I was wondering: could these issues be resolved by deploying the website to a VM?
Script that uploads the video file:
var filesuploaded = 0;
var faileduploaded = 0;
$(function () {
var uploader = new plupload.Uploader({
runtimes: 'gears,html5,flash,silverlight,browserplus',
browse_button: '<%= pickfiles.ClientID %>',
container: 'container',
max_file_size: '<%= MaxMediaSize %>mb',
url: '<%=Config.GetUrl() %>videos/upload/upload.ashx',
flash_swf_url: '<%=Config.GetUrl() %>plupload/js/plupload.flash.swf',
silverlight_xap_url: '<%=Config.GetUrl() %>plupload/js/plupload.silverlight.xap',
chunk_size: '4mb',
<%= UniqueNames %>
filters: [
{ title: '<%= AllowedFormatsDisplay %>', extensions: '<%= AllowedFormats %>'}],
headers: { UName: '<%=UserName %>', MTP: '<%= MediaType %>' }
});
//uploader.bind('Init', function (up, params) {
// $('#filelist').html("<div>Current runtime: " + params.runtime + "</div>");
//});
uploader.init();
$('#uploadfiles').click(function (e) {
uploader.start();
e.preventDefault();
$("#uploadfiles").hide();
$("#<%= embd.ClientID %>").hide();
});
uploader.bind('FilesAdded', function (up, files) {
$("#uploadfiles").show();
$("#<%= msg.ClientID %>").html("");
var count=0;
$.each(files, function (i, file) {
$('#filelist').append(
'<div class="item_pad_4 bx_br_bt">' + (count + 1) + ': ' + file.name + ' (' + plupload.formatSize(file.size) + ') <b></b></div>' );
count++;
});
var maxupload = <%= MaxVideoUploads %>;
if(count > maxupload)
{
$.each(files, function(i, file) {
uploader.removeFile(file);
});
$('#filelist').html("");
$("#uploadfiles").hide();
Display_Message("#<%= msg.ClientID %>", "Can't upload more than " + maxupload + " records at once!", 1, 1);
return false;
}
else {
$("#tfiles").html(count);
$("#uploadfiles").removeClass("disabled");
$("#<%= pickfiles.ClientID %>").hide();
}
up.refresh(); // Reposition Flash/Silverlight
});
uploader.bind('UploadProgress', function (up, file) {
$('#' + file.id + " b").html(file.percent + "%");
});
uploader.bind('Error', function (up, err) {
$('#filelist').append("<div>Error: " + err.code +
", Message: " + err.message +
(err.file ? ", File: " + err.file.name : "") +
"</div>"
);
up.refresh(); // Reposition Flash/Silverlight
});
var failedstatus = 0;
uploader.bind('FileUploaded', function (up, file, info) {
// encode started
if (info.response != "failed" && info.response != "") {
EncodeVD(file.id, info.response, file.size);
Display_Message('#' + file.id, "Please wait for final processing", 0, 1);
if (failedstatus == 0)
Redirect(info.response);
filesuploaded++;
}
else {
Display_Message('#' + file.id, "Response is: " + info.response, 0, 1);
}
});
});
var redcnt = 0;
function Redirect(filename) {
var IntervalID = setInterval(function () {
redcnt++;
if (redcnt > 2) {
clearInterval(IntervalID);
var tfiles = $("#tfiles").html();
if(tfiles == faileduploaded) { // break further processing all videos failed to upload
}
else if (filesuploaded >= tfiles) {
document.location = "<%=ConfirmPageUrl %>?fn=" + filename + "&gid=<%=GalleryID %>&uvids=" + $("#tfiles").html() + "&mpid=" + $("#maxpid").html().trim() + "<%=GroupParam %>";
}
}
}, 2000);
}
function EncodeVD(mid, mfile, msize) {
var params = '<%= EncodingParams %>&fn=' + mfile;
$.ajax({
type: 'GET',
url: '<%= Encoding_Handler_Path %>',
data: params,
async: true,
success: function (msg) {
if (msg == "Success" || msg == "") {
$('#' + mid).html('<strong>Uploading Completed Successfully - Wait for Processing.</strong>');
}
else {
failedstatus = 1;
faileduploaded++;
Display_Message('#' + mid, "Response is: " + msg, 0, 1);
}
}
});
}
Server-side code for processing the file upload:
private int MediaType = 0; // 0 : video, 1: audio
public void ProcessRequest (HttpContext context) {
try
{
context.Response.ContentType = "text/plain";
context.Response.Write(ProcessMedia(context));
}
catch (Exception ex)
{
context.Response.Write("error|" + ex.Message);
}
}
public string ProcessMedia(HttpContext context)
{
if (context.Request.Files.Count > 0)
{
int chunk = context.Request["chunk"] != null ? int.Parse(context.Request["chunk"]) : 0;
string fileName = context.Request["name"] != null ? context.Request["name"] : string.Empty;
//string _fileName = fileName.Remove(fileName.LastIndexOf(".")) + "-" + Guid.NewGuid().ToString().Substring(0, 6) + "" + fileName.Remove(0, fileName.LastIndexOf("."));
HttpPostedFile fileUpload = context.Request.Files[0];
string upath = "";
if (context.Request.Headers["UName"] != null)
upath = context.Request.Headers["UName"].ToString();
//if (CloudSettings.EnableCloudStorage && upath != "")
// _fileName = upath.Substring(0, 3) + "-" + _fileName; // avoid duplication in cloud storage
if (context.Request.Headers["MTP"] != null)
MediaType = Convert.ToInt32(context.Request.Headers["MTP"]);
//string extensions = "";
//if (MediaType == 0)
// extensions = Site_Settings.Video_Allowed_Formats;
//else
// extensions = Site_Settings.Audio_Allowed_Formats;
//bool sts = UtilityBLL.Check_File_Extension(extensions, fileName.ToLower());
//if (sts == false)
//{
// return "Invalid format, please upload proper video!"; // Invalid video format, please upload proper video
//}
int allowable_size_mb = 0;
if (MediaType == 0)
{
allowable_size_mb = Site_Settings.Video_Max_Size;
}
else
{
allowable_size_mb = Site_Settings.Audio_Max_Size;
}
int UploadSize = allowable_size_mb * 1000000;
if (fileUpload.ContentLength > UploadSize)
{
return "Video Limit Exceeds";
}
string uploadPath = "";
// check whether audio / mp3 encoding enabled
if (this.MediaType == 1)
{
// audio encoding
if (fileName.EndsWith(".mp3"))
{
// upload mp3 directly in mp3 path instead of default path
if (upath == "")
uploadPath = UrlConfig.MP3_Path(); // source video path
else
uploadPath = UrlConfig.MP3_Path(upath); // source video path
}
else
{
// default path
if (upath == "")
uploadPath = UrlConfig.Source_Video_Path(); // source video path
else
uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
}
}
else
{//azure
// default path
if (upath == "")
uploadPath = UrlConfig.Source_Video_Path(); // source video path
else
uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
}
FileStream fs;
using (fs = new FileStream(Path.Combine(uploadPath, fileName), chunk == 0 ? FileMode.Create : FileMode.Append))
{
byte[] buffer = new byte[fileUpload.InputStream.Length];
fileUpload.InputStream.Read(buffer, 0, buffer.Length);
fs.Write(buffer, 0, buffer.Length);
}
return fileName; // "Success";
}
else
{
return "failed";
}
return "";
}
public bool IsReusable {
get {
return false;
}
}
Code in encode.ashx responsible for encoding the video:
private string EncodeMedia(HttpContext context)
{
string sourcepath = "";
string publishedpath = "";
string mp3path = "";
string thumbpath = "";
if (this.UserName != "")
{//azure
sourcepath = UrlConfig.Source_Video_Path(this.UserName);
publishedpath = UrlConfig.Published_Video_Path(this.UserName);
mp3path = UrlConfig.MP3_Path(this.UserName);
thumbpath = UrlConfig.Thumbs_Path(this.UserName);
}
else
{
sourcepath = UrlConfig.Source_Video_Path();
publishedpath = UrlConfig.Published_Video_Path();
mp3path = UrlConfig.MP3_Path();
thumbpath = UrlConfig.Thumbs_Path();
}
if (this.FileName.EndsWith(".mp3") && this.MediaType == 1)
{
// mp3 and audio format
if (!File.Exists(mp3path + "/" + this.FileName))
{
return "Audio file not found!";
}
}
else
{
// rest normal video and audio encoding
if (!File.Exists(sourcepath + "/" + this.FileName))
{
return "Source file not found!";
}
}
if (CloudSettings.EnableCloudStorage && this.UserName != "")
this.FileName = this.UserName.Substring(0, 3) + "-" + this.FileName; // avoid duplication in cloud storage
//double f_contentlength = 0;
//if (Site_Settings.Feature_Packages == 1)
//{
// if (Config.GetMembershipAccountUpgradeType() != 1)
// {
// // Check whether user have enough space to upload content
// // Restriction only for normal or premium users
// f_contentlength = (double)fileUpload.ContentLength / 1000000;
// string media_field_name = "space_video";
// if (MediaType == 1)
// media_field_name = "space_audio";
// if (!User_PackagesBLL.Check_User_Space_Status(upath, media_field_name, f_contentlength) && !isAdmin)
// {
// // insufficient credits to upload content
// return "Insufficient credits to upload media file"; // Response.Redirect(Config.GetUrl("myaccount/packages.aspx?status=" + media_field_name), true);
// }
// }
//}
this.backgroundpublishing = true; // should be true on direct encoding
// Video Processing
string flv_filename = "";
string original_filename = "";
string thumb_filename = "";
string duration = "";
int duration_sec = 0;
// set video actions : 1 -> on, 0 -> off
int isenabled = 1;
int ispublished = 1;
int isreviewed = 1;
int isresponse = 0;
if (Response_VideoID > 0)
isresponse = 1;
string flv_url = "none";
string thumb_url = "none";
string org_url = "none";
string _embed = "";
string errorcode = "0";
VideoInfo info = null;
if (Site_Settings.Content_Approval == 0)
isreviewed = 0;
// check whether audio / mp3 encoding enabled
if (this.FileName.EndsWith(".mp3") && this.MediaType==1)
{
// audio encoding
// mp3 file already
// so no encoding happens
MediaHandler _minfo = new MediaHandler();
_minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;
_minfo.FileName = FileName;
_minfo.InputPath = mp3path;
info = _minfo.Get_Info();
flv_filename = FileName;
original_filename = FileName;
duration = info.Duration;
duration_sec = info.Duration_Sec;
isenabled = 1; // enabled
}
else if (this.directpublishing)
{
// publish video
ArrayList itags = new ArrayList();
MHPEncoder encoder = new MHPEncoder();
//if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
// encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
//encoder.ThumbFfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder for thumbs processing
//azure
encoder.FfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder
encoder.FlvToolPath = Encoding_Settings.FLVTOOLPATH; // set meta information for flv videos
encoder.Mp4BoxPath = Encoding_Settings.MP4BoxPath; // set meta information for mp4 videos
encoder.SourcePath = sourcepath;
encoder.SourceFileName = this.FileName;
// No cloud storage on direct encoding
//if (CloudSettings.EnableCloudStorage)
// encoder.EnableCloudStorage = true;
if (MediaType == 1)
{
// audio encoding
itags.Add("14");
encoder.iTags = itags;
encoder.GrabThumbs = false;
encoder.PublishedPath = mp3path;
//_vprocess.OutputPath = this.MP3Path;
//_vprocess.isAudio = true;
}
else
{
// video encoding
itags.Add(EncoderSettings.DefaultITagValue.ToString()); // 5 for 360p mp4 encoding
//itags.Add(7); // this will use the "7" case settings to publish a second video ending with _7.mp4 instead of _5.mp4
// so there will be 2 videos with different resolutions published at the end of the process?
// yes, make sure to use proper settings; first test it directly via the command line.
// okay, I got it. But I'm going to have to use a different media player to incorporate those settings
// once published you can load different videos for different users by checking _7.mp4 (end) va
// okay, got it.
//azure
encoder.PublishedPath = publishedpath;
encoder.iTags = itags;
encoder.ThumbsDirectory = thumbpath;
encoder.TotalThumbs = 15;
//_vprocess.ThumbPath = this.ThumbPath;
//_vprocess.OutputPath = this.FLVPath;
//if (Config.isPostWaterMark())
//{
// // script for posting watermark on video
// _vprocess.WaterMarkPath = Server.MapPath(Request.ApplicationPath) + "\\contents\\watermark";
// _vprocess.WaterMarkImage = "watermark.gif";
//}
}
int deleteoption = Site_Settings.Video_Delete_Original;
if (deleteoption == 1)
{
encoder.DeleteSource = true;
}
// background processing
if (this.backgroundpublishing && this.MediaType==0)
{
encoder.BackgroundProcessing = true;
// get information from source video in order to store it in database
MediaHandler _minfo = new MediaHandler();
//if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
// encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
//else
_minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;
_minfo.FileName = FileName;
_minfo.InputPath = sourcepath;
info = _minfo.Get_Info();
}
// encode video processing
Video_Information vinfo = encoder.Process();
if (vinfo.ErrorCode > 0)
{
errorcode = vinfo.ErrorCode.ToString();
ErrorLgBLL.Add_Log("Encoding Failed Log", "", "encoding error: " + vinfo.ErrorCode.ToString() + "<br />Description: " + vinfo.ErrorDescription.ToString());
//return vinfo.ErrorDescription;
}
// Double check validation
// if published video exist
// if thumb exist
// then proceed further
if (MediaType == 0)
{
if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
{
return "Video failed to published properly.";
}
if (!File.Exists(encoder.ThumbsDirectory + "/" + vinfo.ThumbFileName))
{
return "Thumbs failed to grab from video properly.";
}
}
else
{
if (vinfo.FLVVideoName == "")
{
vinfo.FLVVideoName = this.FileName.Remove(this.FileName.LastIndexOf(".")) + "_14.mp3"; // mp3 file path name
}
if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
{
return "Audio failed to published properly.";
}
}
// Now that thumbs and video are published, proceed with data processing
// get information from vinfo object
if (this.backgroundpublishing && this.MediaType == 0)
{
string OutputFileName = this.FileName.Remove(this.FileName.LastIndexOf("."));
flv_filename = OutputFileName + "_" + EncoderSettings.DefaultITagValue + "." + EncoderSettings.Return_Output_Extension(EncoderSettings.DefaultITagValue);
original_filename = vinfo.OriginalVideoName;
thumb_filename = OutputFileName + "_008.jpg"; // info.ThumbFileName;
duration = info.Duration;
duration_sec = info.Duration_Sec;
}
else
{
flv_filename = vinfo.FLVVideoName;
original_filename = vinfo.OriginalVideoName;
thumb_filename = vinfo.ThumbFileName;
duration = vinfo.Duration;
duration_sec = vinfo.Duration_Sec;
isenabled = vinfo.isEnabled;
}
// No cloud storage on direct encoding.
// Note: cloud storage only works if background processing is disabled
// or in the case of scheduled processing
if (CloudSettings.EnableCloudStorage && errorcode == "0")
{
flv_url = "amazon";
org_url = "https://s3.amazonaws.com/" + CloudSettings.OriginalVideoBucketName + "/" + this.FileName;
thumb_url = "https://s3.amazonaws.com/" + CloudSettings.ThumbsBucketName + "/" + thumb_filename;
}
}
else
{
// set publishing status off.
ispublished = 0;
original_filename = this.FileName;
}
// Store video information in database
string ipaddress = context.Request.ServerVariables["REMOTE_ADDR"].ToString();
// Store media information in database
Video_Struct vd = new Video_Struct();
vd.CategoryID = 0; // store categoryname or term instead of category id
vd.Categories = Categories;
vd.UserName = UserName;
vd.Title = "";
vd.Description = "";
vd.Tags = Tags;
vd.Duration = duration;
vd.Duration_Sec = duration_sec;
vd.OriginalVideoFileName = original_filename;
vd.VideoFileName = flv_filename;
vd.ThumbFileName = thumb_filename;
vd.isPrivate = Privacy;
vd.AuthKey = PAuth;
vd.isEnabled = isenabled;
vd.Response_VideoID = Response_VideoID; // video responses
vd.isResponse = isresponse;
vd.isPublished = ispublished;
vd.isReviewed = isreviewed;
vd.FLV_Url = flv_url;
vd.Thumb_Url = thumb_url;
vd.Org_Url = org_url;
vd.Embed_Script = _embed;
vd.isExternal = 0; // website own video, 1: embed video
vd.IPAddress = ipaddress;
vd.Type = MediaType;
vd.YoutubeID = "";
vd.isTagsreViewed = 1;
vd.Mode = 0; // filter videos based on website sections
//vd.ContentLength = f_contentlength;
vd.GalleryID = GID;
vd.ErrorCode = Convert.ToInt32(errorcode);
long videoid = VideoBLL.Process_Info(vd, false);
// Process tags
if (Tags != "")
{
int tag_type = 0; // represent videos
if (MediaType == 1)
tag_type = 4; // represent audio file
TagsBLL.Process_Tags(Tags, tag_type, 0);
}
if (Response_VideoID > 0)
{
VideoBLL.Update_Responses(Response_VideoID);
}
return "Success";
}
-
-
Monster Battery Power Revisited
28 May 2010, by Multimedia Mike (Python, Science Projects). So I have this new fat netbook battery and I performed an experiment to determine how long it really lasts. In my last post on the matter, it was suggested that I should rely on the information that gnome-power-manager is giving me. However, I have rarely seen GPM report more than about 2 hours of charge; even on a full battery, it only reports 3h25m, when I profiled it as lasting over 5 hours in my typical use. So I started digging to understand how GPM gets its numbers and to determine whether, perhaps, it's not getting accurate data from the system.
I started poking around /proc for the data I wanted. You can learn a lot in /proc as long as you know the right question to ask. I had to remember what the power subsystem is called (ACPI), and this led me to /proc/acpi/battery/BAT0/state, which has data such as:
present: yes
capacity state: ok
charging state: charged
present rate: unknown
remaining capacity: 100 mAh
present voltage: 8326 mV
"Remaining capacity" rated in mAh is a little odd ; I would later determine that this should actually be expressed as a percentage (i.e., 100% charge at the time of this reading). Examining the GPM source code, it seems to determine as a function of the current CPU load (queried via /proc/stat) and the battery state queried via a facility called devicekit. I couldn’t immediately find any source code to the latter but I was able to install a utility called ’devkit-power’. Mostly, it appears to rehash data already found in the above /proc file.
Curiously, the file /proc/acpi/battery/BAT0/info, which displays essential information about the battery, reports the design capacity of my battery as only 4400 mAh, which is true for the original battery; the new monster battery is supposed to be 10400 mAh. I can imagine that all of these data points could be conspiring to under-report my remaining battery life.
Science project: Repeat the previous power-related science project, but also parse and track the "remaining capacity" and "present voltage" fields from the battery state proc file.
Let's skip straight to the results (which are consistent with my last set of results in terms of longevity):
So there is definitely something strange going on with the reporting: the 4400 mAh battery reports discharge at a linear rate, while the 10400 mAh battery reports a precipitous dropoff after 60%.
Another curious item is that my script initially broke when there was 20% power remaining, which, as you can imagine, is a really annoying time to discover such a bug. At that point, the "time to empty" reported by devkit-power jumped from 0 seconds to 20 hours (the first state change observed for that field).
Here's my script, this time elevated from a Bash script to Python. It requires xdotool and devkit-power to be installed (both should be available in your distro's package manager).
#!/usr/bin/python

import commands
import random
import sys
import time

XDOTOOL = "/usr/bin/xdotool"
BATTERY_STATE = "/proc/acpi/battery/BAT0/state"
DEVKIT_POWER = "/usr/bin/devkit-power -i /org/freedesktop/DeviceKit/Power/devices/battery_BAT0"

print "count, unixtime, proc_remaining_capacity, proc_present_voltage, devkit_percentage, devkit_voltage"

count = 0
while 1:
    # wiggle the mouse so the machine never goes idle
    commands.getstatusoutput("%s mousemove %d %d" % (XDOTOOL, random.randrange(0, 800), random.randrange(0, 480)))

    # parse the remaining capacity and present voltage out of the ACPI proc file
    battery_state = open(BATTERY_STATE).read().splitlines()
    for line in battery_state:
        if line.startswith("remaining capacity:"):
            proc_remaining_capacity = int(line.lstrip("remaining capacity: ").rstrip("mAh"))
        elif line.startswith("present voltage:"):
            proc_present_voltage = int(line.lstrip("present voltage: ").rstrip("mV"))

    # parse the percentage and voltage reported by devkit-power
    devkit_state = commands.getoutput(DEVKIT_POWER).splitlines()
    for line in devkit_state:
        line = line.strip()
        if line.startswith("percentage:"):
            devkit_percentage = int(line.lstrip("percentage: ").rstrip('%'))
        elif line.startswith("voltage:"):
            devkit_voltage = float(line.lstrip("voltage: ").rstrip('V')) * 1000

    print "%d, %d, %d, %d, %d, %d" % (count, time.time(), proc_remaining_capacity, proc_present_voltage, devkit_percentage, devkit_voltage)
    sys.stdout.flush()

    time.sleep(60)
    count += 1
-
Anatomy of an optimization: H.264 deblocking
As mentioned in the previous post, H.264 has an adaptive deblocking filter. But what exactly does that mean, and more importantly, what does it mean for performance? And how can we make it as fast as possible? In this post I'll try to answer these questions, particularly in relation to my recent deblocking optimizations in x264.
H.264's deblocking filter has two steps: strength calculation and the actual filter. The first step calculates the parameters for the second step. The filter runs on all the edges in each macroblock. That's 4 vertical edges of length 16 pixels and 4 horizontal edges of length 16 pixels. The vertical edges are filtered first, from left to right, then the horizontal edges, from top to bottom (order matters!). The leftmost edge is the one between the current macroblock and the left macroblock, while the topmost edge is the one between the current macroblock and the top macroblock.
Here’s the formula for the strength calculation in progressive mode. The highest strength that applies is always selected.
If we're on the edge between an intra macroblock and any other macroblock: Strength 4
If we're on an internal edge of an intra macroblock: Strength 3
If either side of a 4-pixel-long edge has residual data: Strength 2
If the motion vectors on opposite sides of a 4-pixel-long edge are at least a pixel apart (in either the x or y direction) or the reference frames aren't the same: Strength 1
Otherwise: Strength 0 (no deblocking)
These values are then thrown into a lookup table depending on the quantizer: higher quantizers have stronger deblocking. Then the actual filter is run with the appropriate parameters. Note that Strength 4 is actually a special deblocking mode that performs a much stronger filter and affects more pixels.
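For illustration, here is a minimal C sketch of that decision logic for a single 4-pixel edge segment. This is not x264's actual deblock_strength_c; the struct and field names are invented for the example, and motion vectors are assumed to be stored in quarter-pel units, so "at least a pixel apart" means a difference of 4 or more.
#include <stdlib.h>
/* Hypothetical description of one side of a 4-pixel edge segment. */
typedef struct {
    int is_intra;     /* the block on this side is intra-coded           */
    int has_residual; /* the block on this side has nonzero coefficients */
    int ref_idx;      /* reference frame index                           */
    int mv_x, mv_y;   /* motion vector, quarter-pel units                */
} edge_side_t;
/* Boundary strength (0-4) for one segment, per the progressive-mode rules above. */
static int edge_strength(const edge_side_t *a, const edge_side_t *b, int is_mb_edge)
{
    if (a->is_intra || b->is_intra)
        return is_mb_edge ? 4 : 3; /* macroblock edge vs. internal edge of an intra MB */
    if (a->has_residual || b->has_residual)
        return 2;
    if (a->ref_idx != b->ref_idx ||
        abs(a->mv_x - b->mv_x) >= 4 || abs(a->mv_y - b->mv_y) >= 4)
        return 1;
    return 0; /* no deblocking on this segment */
}
The quantizer-dependent lookup mentioned above would then map this strength and the edge's QP to the thresholds used by the actual filter.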
One can see somewhat intuitively why these strengths are chosen. The deblocker exists to get rid of sharp edges caused by the block-based nature of H.264, and so the strength depends on what exists that might cause such sharp edges. The strength calculation is a way to use existing data from the video stream to make better decisions during the deblocking process, improving compression and quality.
Both the strength calculation and the actual filter (not described here) are very complex if naively implemented. The latter can be SIMD'd without too much difficulty; no H.264 decoder can achieve reasonable performance without doing so. But what about optimizing the strength calculation? A quick analysis shows that this can be beneficial as well.
Since we have to check both horizontal and vertical edges, we have to check up to 32 pairs of coefficient counts (for residual), 16 pairs of reference frame indices, and 128 motion vector values (counting x and y as separate values). This is a lot of calculation; a naive implementation can take 500-1000 clock cycles on a modern CPU. Of course, there are a lot of shortcuts we can take. Here are some examples:
- If the macroblock uses the 8×8 transform, we only need to check 2 edges in each direction instead of 4, because we don’t deblock inside of the 8×8 blocks.
- If the macroblock is a P-skip, we only have to check the first edge in each direction, since there’s guaranteed to be no motion vector differences, reference frame differences, or residual inside of the macroblock.
- If the macroblock has no residual at all, we can skip that check.
- If we know the partition type of the macroblock, we can do motion vector checks only along the edges of the partitions.
- If the effective quantizer is so low that no deblocking would be performed no matter what, don’t bother calculating the strength.
But even all of this doesn't save us from ourselves. We still have to iterate over a ton of edges, checking each one. Stuff like the partition-checking logic greatly complicates the code and adds overhead even as it reduces the number of checks. And in many cases decoupling the checks to add such logic will make it slower: if the checks are coupled, we can avoid doing a motion vector check if there's residual, since Strength 2 overrides Strength 1.
But wait. What if we could do this in SIMD, just like the actual loopfilter itself? Sure, it seems more of a problem for C code than assembly, but there aren't any obvious things in the way. Many years ago, Loren Merritt (pengvado) wrote the first SIMD implementation that I know of (for ffmpeg's decoder); it is quite fast, so I decided to work on porting the idea to x264 to see if we could eke out a bit more speed here as well.
Before I go over what I had to do to make this change, let me first describe how deblocking is implemented in x264. Since the filter is a loopfilter, it acts “in loop” and must be done in both the encoder and decoder — hence why x264 has it too, not just decoders. At the end of encoding one row of macroblocks, x264 goes back and deblocks the row, then performs half-pixel interpolation for use in encoding the next frame.
We do it per-row for reasons of cache coherency: deblocking accesses a lot of pixels and a lot of code that wouldn't otherwise be used, so it's more efficient to do it in a single pass as opposed to deblocking each macroblock immediately after encoding. Then half-pixel interpolation can immediately re-use the resulting data.
Now to the change. First, I modified deblocking to implement a subset of the macroblock_cache_load function: spend an extra bit of effort loading the necessary data into a data structure which is much simpler to address, as an assembly implementation would need (x264_macroblock_cache_load_deblock). Then I massively cleaned up deblocking to move all of the core strength-calculation logic into a single, small function that could be converted to assembly (deblock_strength_c). Finally, I wrote the assembly functions and worked with Loren to optimize them. Here's the result.
And the timings for the resulting assembly function on my Core i7, in cycles:
deblock_strength_c: 309
deblock_strength_mmx: 79
deblock_strength_sse2: 37
deblock_strength_ssse3: 33
Now that is a seriously nice improvement. 33 cycles on average to perform that many comparisons; that's absurdly low, especially considering the SIMD takes no branchy shortcuts: it always checks every single edge! I walked over to my performance chart and happily crossed off a box.
But I had a hunch that I could do better. Remember, as mentioned earlier, we're reloading all that data back into our data structures in order to address it. This isn't that slow, but it takes enough time to significantly cut down on the gain of the assembly code. And worse, less than a row ago, all this data was in the correct place to be used (when we had just finished encoding the macroblock)! But if we did the deblocking right after encoding each macroblock, the cache issues would make it too slow to be worth it (yes, I tested this). So I went back to other things, a bit annoyed that I couldn't get the full benefit of the changes.
Then, yesterday, I was talking with Pascal, a former Xvid dev and current video hacker over at Google, about various possible x264 optimizations. He had seen my deblocking changes, and we discussed them a bit as well. Then two lines hit me like a pile of bricks:
<_skal_> tried computing the strength at least?
<_skal_> while it's fresh
Why hadn't I thought of that? Do the strength calculation immediately after encoding each macroblock, save the result, and then go pick it up later for the main deblocking filter. Then we can use the data right there and then for the strength calculation, but we don't have to do the whole deblock process until later.
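A rough sketch of the resulting two-phase arrangement, with invented names and a placeholder strength computation (this is not the actual x264 code): compute and stash the strengths for each macroblock while its data is still loaded, then let the per-row deblock pass consume them later.
#include <stdint.h>
#include <string.h>
/* Hypothetical per-macroblock cache of boundary strengths:
 * 2 directions x 4 edges x 4 four-pixel segments. */
typedef struct {
    uint8_t bs[2][4][4];
} mb_strengths_t;
/* Phase 1: right after encoding macroblock mb_xy, while its motion vectors,
 * reference indices and coefficient counts are still hot in cache, compute
 * the strengths (e.g. with the edge_strength() sketch above) and store them.
 * Here the entry is simply zeroed as a placeholder. */
static void save_strengths(mb_strengths_t *cache, int mb_xy)
{
    memset(cache[mb_xy].bs, 0, sizeof(cache[mb_xy].bs));
}
/* Phase 2: at the end of the row, run the actual filter using the cached
 * strengths; nothing has to be reloaded just to recompute them. */
static void deblock_row(const mb_strengths_t *cache, int mb_y, int mb_width)
{
    for (int mb_x = 0; mb_x < mb_width; mb_x++) {
        const uint8_t (*bs)[4][4] = cache[mb_y * mb_width + mb_x].bs;
        (void)bs; /* filter this macroblock's edges with these strengths */
    }
}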
I went and implemented it and, after working my way through a horde of bugs, eventually got a working implementation. A big catch was that of slices: deblocking normally acts between slices even though normal encoding does not, so I had to perform some extra munging to get that to work. By midday today I was able to cross yet another box off on the performance chart. And now it's committed.
Sometimes chatting for 10 minutes with another developer is enough to spot the idea that your brain somehow managed to miss for nearly a straight week.
NB: the performance chart is for a specific test clip at a specific set of settings (super fast settings) relevant to the company I work at, so it isn't accurate or complete for, say, default settings.
Update: Here's a higher-resolution version of the current chart, as requested in the comments.