
Media (1)
-
Revolution of Open-source and film making towards open film making
6 October 2011, by
Updated: July 2013
Language: English
Type: Text
Other articles (48)
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...) -
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out. -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (9553)
-
using ffmpeg as multiple downloads
26 April 2013, by user2321982
Well guys, I'm having problems with my application. I'll explain a bit more: I'm using the plupload uploader plugin to upload multiple videos, and I'm using the ffmpeg extension with my PHP version.
// Convert divs to queue widgets when the DOM is ready
$(function() {
    $("#uploader").pluploadQueue({
        // General settings
        runtimes : 'gears,flash,silverlight,browserplus,html5',
        url : 'upload.php?id=<?= $login_usuario; ?>&name=<?= $name_user; ?>',
        flash_swf_url : 'plugins/plupload/plupload.flash.swf',
        containers: 'plupload',
        multipart: true,
        urlstream_upload: true,
        multipart_params: {directory: 'uploads'},
        multi_selection: true,

        max_file_size : '500mb',
        chunk_size : '1mb',
        unique_names : true,

        // Specify what files to browse for
        filters : [
            {title : "Video Files", extensions : "avi,mpg,wmv,flv,3gp,mpeg,mpeg4,mpg4,mp4"}
        ],
        // Silverlight settings
        silverlight_xap_url : 'plugins/plupload/plupload.silverlight.xap'
    });

    uploader.ini();

    // Client side form validation
    $('form').submit(function(e) {
        var uploader = $('#uploader').plupload('getUploader');

        // Files in queue upload them first
        if (uploader.files.length > 0) {
            // When all files are uploaded submit form
            uploader.bind('StateChanged', function() {
                if (uploader.files.length === (uploader.total.uploaded + uploader.total.failed)) {
                    $('form')[0].submit();
                }
            });

            uploader.start();
        } else
            alert('You must at least upload one file.');

        return false;
    });
});

Your browser doesn't have Flash, Silverlight, Gears, BrowserPlus or HTML5 support.
This JavaScript is called from a PHP file and is what performs the upload of the videos.
What I want is to get the length, resolution and file size, and to take a screenshot of the video at a random frame. So far so good; the problem is that it only works with one upload at a time.
When I send more than one file, it runs the ffmpeg procedure, but for most of the videos it does not take a screenshot.
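For reference, the metadata and the screenshot described above come down to a handful of ffmpeg-php calls. A minimal standalone sketch (the file path is a placeholder, and the ffmpeg extension is assumed to be loaded, as in the code below):

<?php
// Standalone sketch of the ffmpeg-php calls used further below.
// 'uploads/example.mp4' is a hypothetical path, not from the question.
$file  = 'uploads/example.mp4';
$movie = new ffmpeg_movie($file);

$duration = round($movie->getDuration());   // length in seconds
$width    = $movie->getFrameWidth();        // resolution
$height   = $movie->getFrameHeight();
$size     = filesize($file);                // file size in bytes

// Grab a random frame and write it out as a JPEG screenshot
$frame = $movie->getFrame(rand(1, $movie->getFrameCount()));
imagejpeg($frame->toGDImage(), 'uploads/snapshots/example.jpg', 90);
?>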
Below is my code:

<?php
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
if (!extension_loaded ('ffmpeg') ) exit ( 'ffmpeg não foi carregado!' );
// HTTP headers for no cache etc
header("Expires: Mon, 26 Jul 1997 05:00:00 GMT");
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Cache-Control: no-store, no-cache, must-revalidate");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");
include("includes/seguranca.php");
include("includes/classes/class.mysql.php");
include("includes/funcoes/sql_inject.php");
$id_user = anti_injection($_GET['id']);
session_start();
// Settings
//$targetDir = ini_get("upload_tmp_dir") . DIRECTORY_SEPARATOR . "plupload";
$targetDir = 'uploads/';
//$cleanupTargetDir = false; // Remove old files
//$maxFileAge = 60 * 60; // Temp file age in seconds
// 5 minutes execution time
@set_time_limit(5 * 60);
// Uncomment this one to fake upload time
// usleep(5000);
// Get parameters
$chunk = isset($_REQUEST["chunk"]) ? $_REQUEST["chunk"] : 0;
$chunks = isset($_REQUEST["chunks"]) ? $_REQUEST["chunks"] : 0;
$fileName = isset($_REQUEST["name"]) ? $_REQUEST["name"] : '';
// Clean the fileName for security reasons
$fileName = preg_replace('/[^\w\._]+/', '', $fileName);
// Make sure the fileName is unique but only if chunking is disabled
if ($chunks < 2 && file_exists($targetDir . DIRECTORY_SEPARATOR . $fileName)) {
$ext = strrpos($fileName, '.');
$fileName_a = substr($fileName, 0, $ext);
$fileName_b = substr($fileName, $ext);
$count = 1;
while (file_exists($targetDir . DIRECTORY_SEPARATOR . $fileName_a . '_' . $count . $fileName_b))
$count++;
$fileName = $fileName_a . '_' . $count . $fileName_b;
}
// Create target dir
if (!file_exists($targetDir))
@mkdir($targetDir);
// Look for the content type header
if (isset($_SERVER["HTTP_CONTENT_TYPE"]))
$contentType = $_SERVER["HTTP_CONTENT_TYPE"];
if (isset($_SERVER["CONTENT_TYPE"]))
$contentType = $_SERVER["CONTENT_TYPE"];
// Handle non multipart uploads older WebKit versions didn't support multipart in HTML5
if (strpos($contentType, "multipart") !== false) {
    if (isset($_FILES['file']['tmp_name']) && is_uploaded_file($_FILES['file']['tmp_name'])) {
        // Open temp file
        $out = fopen($targetDir . DIRECTORY_SEPARATOR . $fileName, $chunk == 0 ? "wb" : "ab");
        if ($out) {
            // Read binary input stream and append it to temp file
            $in = fopen($_FILES['file']['tmp_name'], "rb");
            if ($in) {
                while ($buff = fread($in, 4096))
                    fwrite($out, $buff);
            } else
                die('{"jsonrpc" : "2.0", "error" : {"code": 101, "message": "Failed to open input stream."}, "id" : "id"}');
            fclose($in);
            fclose($out);
            @unlink($_FILES['file']['tmp_name']);
        } else
            die('{"jsonrpc" : "2.0", "error" : {"code": 102, "message": "Failed to open output stream."}, "id" : "id"}');
    } else
        die('{"jsonrpc" : "2.0", "error" : {"code": 103, "message": "Failed to move uploaded file."}, "id" : "id"}');
} else {
    // Open temp file
    $out = fopen($targetDir . DIRECTORY_SEPARATOR . $fileName, $chunk == 0 ? "wb" : "ab");
    if ($out) {
        // Read binary input stream and append it to temp file
        $in = fopen("php://input", "rb");
        if ($in) {
            while ($buff = fread($in, 4096))
                fwrite($out, $buff);
        } else
            die('{"jsonrpc" : "2.0", "error" : {"code": 101, "message": "Failed to open input stream."}, "id" : "id"}');
        fclose($in);
        fclose($out);
    } else
        die('{"jsonrpc" : "2.0", "error" : {"code": 102, "message": "Failed to open output stream."}, "id" : "id"}');
}
$photoName = $_FILES['file']['name'];
$exploded_photoName = explode('.', $photoName);
$user_id = $_SESSION['SESS_MEMBER_ID'];
if(!isset($_SESSION[$photoName])) {
$_SESSION[$photoName] = '1';
}
if($chunk==1){
    $movie_file = realpath("/uploads/".$fileName);
    // instantiate the ffmpeg_movie class to get the information we want from the video
    $movie = new ffmpeg_movie($movie_file);
    // get the video duration in seconds
    $duration = round($movie->getDuration(), 0);
    // get the number of frames in the video
    $totalFrames = $movie->getFrameCount();
    // get the video height in pixels
    $height = $movie->getFrameHeight();
    // get the video width in pixels
    $width = $movie->getFrameWidth();
    $thumbnailOf = $movie->getFrameRate() * 5;
    $thumbnailOf = round($movie->getFrameCount() / 2);
    // we need to create a GD image for ffmpeg-php to work on
    $image = imagecreatetruecolor($width, $height);
    // create the frame instance with the ffmpeg_frame class
    $frame = new ffmpeg_frame($image);
    // choose the frame we want to save as a JPEG
    $thumbnailOf = rand(1, $movie->getFrameCount());
    // get the frame
    $frame = $movie->getFrame($thumbnailOf);
    // convert it to a GD image
    $image = $frame->toGDImage();
    $size = filesize("uploads/".$fileName);
    $valuea = rand(0,100);
    $valueb = rand(0,100);
    $valuec = rand(0,10);
    $nome = @md5(date('now') * $valuea - $valueb / $valuec);
    $dir = realpath("uploads/snapshots/$nome.jpg");
    // save to disk.
    imagejpeg($image, $dir, 100);
    $resolu = $width."x".$height;
    mysql_query("INSERT INTO video
        (poster_id,duration,video,titulo,resolucao,tamanho)
        VALUES
        ('$id_user','$duration','$fileName','$exploded_photoName[0]','$resolu','$size')") or die(mysql_error());
}
// Return JSON-RPC response
die('{"jsonrpc" : "2.0", "result" : null, "id" : "id"}');
?>
I did another method, not as effective at doing what I want, but it only works if I open the page individually. I removed the ffmpeg-extension code and put it in a separate page that I include.
<?
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
if (!extension_loaded ('ffmpeg') ) exit ( 'ffmpeg não foi carregado!' );
include("includes/seguranca.php");
include("includes/classes/class.mysql.php");
include("includes/funcoes/sql_inject.php");
$id_user = @anti_injection($_GET['id']);
class process{
public $videos;
public $target;
public $id_vid;
public function screen(){
$videos = $this->videos;
$target = $this->target;
$movie_file = realpath($videos);
// instantiate the ffmpeg_movie class to get the information we want from the video
$movie = new ffmpeg_movie($movie_file);
$totalFrames = $movie->getFrameCount();
$thumbnailOf = $movie->getFrameRate() * 5;
$thumbnailOf = round ( $movie->getFrameCount() / 2 );
// create the frame instance with the ffmpeg_frame class
$frame = new ffmpeg_frame ( $image );
// choose the frame we want to save as a JPEG
$thumbnailOf = rand (1,$movie->getFrameCount());
// get the frame
$frame = $movie->getFrame ( $thumbnailOf );
// convert it to a GD image
$image = $frame->toGDImage ();
$valuea = rand(0,100);
$valueb = rand(0,100);
$valuec = rand(0,10);
$nome = @md5(date('now') * $valuea - $valueb / $valuec);
$dir = $target."snapshots/".$nome.".jpeg";
// save to disk.
imagejpeg($image,$dir, 100);
$q = new Query;
$q ->update('video')
->set(
array(
'screen'=>$nome.".jpeg"
)
)
->where_equal_to(
array(
'id_video'=>$this->id_vid
)
)
->limit(1)
->run();
}
public function duration(){
$videos = $this->videos;
$movie_file = realpath($videos);
// instantiate the ffmpeg_movie class to get the information we want from the video
$movie = new ffmpeg_movie($movie_file);
// get the video duration in seconds
$duration = round ( $movie->getDuration() , 0 );
$q = new Query;
$q ->update('video')
->set(
array(
'duration'=>$duration
)
)
->where_equal_to(
array(
'id_video'=>$this->id_vid
)
)
->limit(1)
->run();
}
public function resolution(){
$videos = $this->videos;
$movie_file = realpath($videos);
// instantiate the ffmpeg_movie class to get the information we want from the video
$movie = new ffmpeg_movie($movie_file);
// get the video height in pixels
$height = $movie->getFrameHeight ();
// get the video width in pixels
$width = $movie->getFrameWidth ();
$resolucao = $width."x".$height;
$q = new Query;
$q ->update('video')
->set(
array(
'resolucao'=>$resolucao
)
)
->where_equal_to(
array(
'id_video'=>$this->id_vid
)
)
->limit(1)
->run();
}
final public function checar($a){
$q = new Query;
$q ->update('video')
->set(
array(
'process'=>$a
)
)
->where_equal_to(
array(
'id_video'=>$this->id_vid
)
)
->limit(1)
->run();
}
}
$q=new Query;
$q->select(
array(
'id_video',
'video',
'dir',
'screen',
'poster_id',
'resolucao',
'duration'
)
)
->from('video')
->run();
if($q){
$users=$q->get_selected();
foreach($users as $user){
$c = new process;
$c->videos = $user['video'];
$c->target = $user['dir'];
$c->id_vid = $user['id_video'];
if(($user['screen'] == 'NULL') or ($user['duration'] == 'NULL') or ($user['resolucao'] == 'NULL')){
$c->checar(0);
if($user['screen'] == 'NULL'){
$c->screen();
}
if($user['duration'] == 'NULL'){
$c->duration();
}
if($user['resolucao'] == 'NULL'){
$c->resolution();
}
}else{
$c->checar(1);
}
}
}
else{
echo 'Sorry, no users found.';
}
?> -
Unable to stream file onto localhost - ffmpeg
18 October 2013, by trueblue
I am new to ffmpeg/ffserver. I am trying to stream a local file named Trial onto localhost using ffserver. I want to play the file in a browser as
http://localhost:8090/feed1.ffm
I am executing the command below in Ubuntu (Trial is an MPEG-TS file):
ffmpeg -i Trial http://localhost:8090/feed1.ffm
Upon executing the above command I am getting the error below:
FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c) 2000-2009 Fabrice Bellard, et al.
configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.3 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
libavutil 49.15. 0 / 49.15. 0
libavcodec 52.20. 1 / 52.20. 1
libavformat 52.31. 0 / 52.31. 0
libavdevice 52. 1. 0 / 52. 1. 0
libavfilter 0. 4. 0 / 0. 4. 0
libswscale 0. 7. 1 / 0. 7. 1
libpostproc 51. 2. 0 / 51. 2. 0
built on Jan 24 2013 19:42:59, gcc: 4.4.3
Seems stream 0 codec frame rate differs from container frame rate: 119.88 (120000/1001) -> 59.94 (60000/1001)
Input #0, mpegts, from 'Trial':
Duration: 00:00:04.22, start: 0.177633, bitrate: 40368 kb/s
Program 2
Stream #0.0[0x21]: Video: mpeg2video, yuv420p, 1280x720 [PAR 1:1 DAR 16:9], 45000 kb/s, 59.94 tbr, 90k tbn, 119.88 tbc
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
Stream #0.0: Video: flv, yuv420p, 352x288, q=1-5, 100 kb/s, 1000k tbn, 15 tbc
Stream #0.1: Audio: mp2, 44100 Hz, mono, s16, 32 kb/s
Stream #0.2: Video: mpeg1video, yuv420p, 160x128, q=3-31, 64 kb/s, 1000k tbn, 3 tbc
Stream #0.3: Audio: mp2, 22050 Hz, mono, s16, 64 kb/s
Stream #0.4: Video: msmpeg4, yuv420p, 352x240, q=3-31, 256 kb/s, 1000k tbn, 15 tbc
Could not find input stream matching output stream #0.1

My ffserver.conf file goes like this:
# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090
# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0
# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 2000
# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000
# This the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000
# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -
# Suppress that if you want to launch ffserver as a daemon.
NoDaemon
##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.
<feed>
# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
# ffmpeg http://localhost:8090/feed1.ffm
# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 5M
# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.
# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg
# Only allow connections from localhost to the feed.
ACL allow 127.0.0.1
</feed>
<stream>
Feed feed1.ffm
Format swf
VideoCodec flv
VideoFrameRate 15
VideoBufferSize 80000
VideoBitRate 100
VideoQMin 1
VideoQMax 5
VideoSize 352x288
PreRoll 0
Noaudio
</stream>
##################################################################
# Now you can define each stream which will be generated from the
# original audio and video stream. Each format has a filename (here
# 'test1.mpg'). FFServer will send this stream when answering a
# request containing this filename.
<stream>
# coming from live feed 'feed1'
Feed feed1.ffm
# Format of the stream : you can choose among:
# mpeg : MPEG-1 multiplexed video and audio
# mpegvideo : only MPEG-1 video
# mp2 : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
# ogg : Ogg format (Vorbis audio codec)
# rm : RealNetworks-compatible stream. Multiplexed audio and video.
# ra : RealNetworks-compatible stream. Audio only.
# mpjpeg : Multipart JPEG (works with Netscape without any plugin)
# jpeg : Generate a single JPEG image.
# asf : ASF compatible streaming (Windows Media Player format).
# swf : Macromedia Flash compatible stream
# avi : AVI format (MPEG-4 video, MPEG audio sound)
Format mpeg
# Bitrate for the audio stream. Codecs usually support only a few
# different bitrates.
AudioBitRate 32
# Number of audio channels: 1 = mono, 2 = stereo
AudioChannels 1
# Sampling frequency for audio. When using low bitrates, you should
# lower this frequency to 22050 or 11025. The supported frequencies
# depend on the selected audio codec.
AudioSampleRate 44100
# Bitrate for the video stream
VideoBitRate 64
# Ratecontrol buffer size
VideoBufferSize 40
# Number of frames per second
VideoFrameRate 3
# Size of the video frame: WxH (default: 160x128)
# The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
# qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
# wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
# hd1080
VideoSize 160x128
# Transmit only intra frames (useful for low bitrates, but kills frame rate).
#VideoIntraOnly
# If non-intra only, an intra frame is transmitted every VideoGopSize
# frames. Video synchronization can only begin at an intra frame.
VideoGopSize 12
# More MPEG-4 parameters
# VideoHighQuality
# Video4MotionVector
# Choose your codecs:
#AudioCodec mp2
#VideoCodec mpeg1video
# Suppress audio
#NoAudio
# Suppress video
#NoVideo
#VideoQMin 3
#VideoQMax 31
# Set this to the number of seconds backwards in time to start. Note that
# most players will buffer 5-10 seconds of video, and also you need to allow
# for a keyframe to appear in the data stream.
#Preroll 15
# ACL:
# You can allow ranges of addresses (or single addresses)
#ACL ALLOW <first address> <last address>
# You can deny ranges of addresses (or single addresses)
#ACL DENY <first address> <last address>
# You can repeat the ACL allow/deny as often as you like. It is on a per
# stream basis. The first match defines the action. If there are no matches,
# then the default is the inverse of the last ACL statement.
#
# Thus 'ACL allow localhost' only allows access from localhost.
# 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
# allow everybody else.
</stream>
##################################################################
# Example streams
# Multipart JPEG
#<stream>
#Feed feed1.ffm
#Format mpjpeg
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#Strict -1
#</stream>
# Single JPEG
#<stream>
#Feed feed1.ffm
#Format jpeg
#VideoFrameRate 2
#VideoIntraOnly
##VideoSize 352x240
#NoAudio
#Strict -1
#</stream>
# Flash
#<stream>
#Feed feed1.ffm
#Format swf
#VideoFrameRate 2
#VideoIntraOnly
#NoAudio
#</stream>
# ASF compatible
<stream>
Feed feed1.ffm
Format asf
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</stream>
# MP3 audio
#<stream>
#Feed feed1.ffm
#Format mp2
#AudioCodec mp3
#AudioBitRate 64
#AudioChannels 1
#AudioSampleRate 44100
#NoVideo
#</stream>
# Ogg Vorbis audio
#<stream>
#Feed feed1.ffm
#Title "Stream title"
#AudioBitRate 64
#AudioChannels 2
#AudioSampleRate 44100
#NoVideo
#</stream>
# Real with audio only at 32 kbits
#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#NoVideo
#NoAudio
#</stream>
# Real with audio and video at 64 kbits
#<stream>
#Feed feed1.ffm
#Format rm
#AudioBitRate 32
#VideoBitRate 128
#VideoFrameRate 25
#VideoGopSize 25
#NoAudio
#</stream>
##################################################################
# A stream coming from a file: you only need to set the input
# filename and optionally a new format. Supported conversions:
# AVI -> ASF
#<stream>
#File "/usr/local/httpd/htdocs/tlive.rm"
#NoAudio
#</stream>
#<stream>
#File "/usr/local/httpd/htdocs/test.asf"
#NoAudio
#Author "Me"
#Copyright "Super MegaCorp"
#Title "Test stream from disk"
#Comment "Test comment"
#</stream>
##################################################################
# RTSP examples
#
# You can access this stream with the RTSP URL:
# rtsp://localhost:5454/test1-rtsp.mpg
#
# A non-standard RTSP redirector is also created. Its URL is:
# http://localhost:8090/test1-rtsp.rtsp
#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#</stream>
##################################################################
# SDP/multicast examples
#
# If you want to send your stream in multicast, you must set the
# multicast address with MulticastAddress. The port and the TTL can
# also be set.
#
# An SDP file is automatically generated by ffserver by adding the
# 'sdp' extension to the stream name (here
# http://localhost:8090/test1-sdp.sdp). You should usually give this
# file to your player to play the stream.
#
# The 'NoLoop' option can be used to avoid looping when the stream is
# terminated.
#<stream>
#Format rtp
#File "/usr/local/httpd/htdocs/test1.mpg"
#MulticastAddress 224.124.0.1
#MulticastPort 5000
#MulticastTTL 16
#NoLoop
#</stream>
##################################################################
# Special streams
# Server status
<stream>
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
#FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
</stream>
# Redirect index.html to the appropriate site
<redirect>
URL http://www.ffmpeg.org/
</redirect>
Could anyone please tell me whether I am missing something, or whether I need to change my ffserver.conf file? I have referred to many websites but am still unable to fix it. Thanks in advance.
-
Working on images asynchronously
To get my quota on buzzwords for the day, we are going to look at using ZeroMQ and Imagick to create a simple asynchronous image processing system. Why asynchronous? First of all, separating the image handling from interactive PHP scripts allows us to scale the image processing separately from the web heads. For example, we could do the image processing on separate servers, which have SSDs attached and more memory. In this example, making the images available to all worker nodes is left to the reader.
Secondly, separating the image processing from a web script can provide more responsive experience to the user. This doesn’t necessarily mean faster, but let’s say in a multiple image upload scenario this method allows the user to do something else on the site while we process the images in the background. This can be beneficial especially in cases where users upload hundreds of images at a time. To achieve a simple distributed image processing infrastructure we are going to use ZeroMQ for communicating between different components and Imagick to work on the images.
The first part we are going to create is a simple "Worker" process skeleton. Naturally, for a live environment you would want more error handling and possibly pcntl for process control, but for the sake of brevity the example is bare-bones:
<?php

define('THUMBNAIL_ADDR', 'tcp://127.0.0.1:5000');
define('COLLECTOR_ADDR', 'tcp://127.0.0.1:5001');

class Worker {

    private $in;
    private $out;

    public function __construct($in_addr, $out_addr)
    {
        $context = new ZMQContext();

        $this->in = new ZMQSocket($context, ZMQ::SOCKET_PULL);
        $this->in->bind($in_addr);

        $this->out = new ZMQSocket($context, ZMQ::SOCKET_PUSH);
        $this->out->connect($out_addr);
    }

    public function work() {
        while ($command = $this->in->recvMulti()) {
            if (isset($this->commands[$command[0]])) {
                echo "Received work" . PHP_EOL;

                $callback = $this->commands[$command[0]];

                array_shift($command);
                $response = call_user_func_array($callback, $command);

                if (is_array($response))
                    $this->out->sendMulti($response);
                else
                    $this->out->send($response);
            }
            else {
                error_log("There is no registered worker for $command[0]");
            }
        }
    }

    public function register($command, $callback)
    {
        $this->commands[$command] = $callback;
    }
}
?>
The Worker class allows us to register commands with callbacks associated with them. In our case the Worker class doesn't actually care or know about the parameters being passed to the actual callback; it just blindly passes them on. We are using two separate sockets in this example, one for incoming work requests and one for passing the results onwards. This allows us to create a simple pipeline by adding more workers into the mix. For example, we could first have a watermark worker, which takes the original image and composites a watermark on it, then passes the file onwards to the thumbnail worker, which creates different sizes of thumbnails and passes the final results to the event collector.
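To make that chaining concrete, here is a rough sketch of what such a watermark worker might look like on top of the same Worker class. It is not from the original article: the inbound address, the 'watermark' command name, the watermark.png file and the compositing choices are all assumptions.

<?php
include __DIR__ . '/common.php';

// Hypothetical watermark stage: pull work on its own address, composite a
// watermark onto the original, then push the job on to the thumbnail worker.
$worker = new Worker('tcp://127.0.0.1:4999', THUMBNAIL_ADDR);

$worker->register('watermark', function ($filename) {
    $im   = new Imagick($filename);
    $mark = new Imagick(__DIR__ . '/watermark.png'); // assumed watermark image

    // Composite the watermark near the bottom-left corner of the original
    $im->compositeImage($mark, Imagick::COMPOSITE_OVER,
                        10, $im->getImageHeight() - $mark->getImageHeight() - 10);
    $im->writeImage($filename);

    // Hand the file on in the multipart format the thumbnail worker expects
    return array('thumbnail', $filename, 50, 50);
});

echo "Running watermark worker.." . PHP_EOL;
$worker->work();

Because Worker::work() sends array responses with sendMulti(), the returned message reaches the thumbnail worker as an ordinary 'thumbnail' command.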
The next part we are going to create is a simple worker script that does the actual thumbnailing of the images:
<?php
include __DIR__ . '/common.php';

// Create worker class and bind the inbound address to 'THUMBNAIL_ADDR' and connect outbound to 'COLLECTOR_ADDR'
$worker = new Worker(THUMBNAIL_ADDR, COLLECTOR_ADDR);

// Register our thumbnail callback, nothing special here
$worker->register('thumbnail', function ($filename, $width, $height) {
    $info = pathinfo($filename);

    $out = sprintf("%s/%s_%dx%d.%s",
                   $info['dirname'],
                   $info['filename'],
                   $width,
                   $height,
                   $info['extension']);

    $status  = 1;
    $message = '';

    try {
        $im = new Imagick($filename);
        $im->thumbnailImage($width, $height);
        $im->writeImage($out);
    }
    catch (Exception $e) {
        $status  = 0;
        $message = $e->getMessage();
    }

    return array(
        'status'    => $status,
        'filename'  => $filename,
        'thumbnail' => $out,
        'message'   => $message,
    );
});

// Run the worker, will block
echo "Running thumbnail worker.." . PHP_EOL;
$worker->work();
As you can see from the code, the thumbnail worker registers a callback for the 'thumbnail' command. The callback does the thumbnailing based on the input and returns the status, the original filename and the thumbnail filename. We have connected our Worker's "outbound" socket to the event collector, which will receive the results from the thumbnail worker and do something with them. What that "something" is depends on you. For example, you could push the response into a websocket to show immediate feedback to the user, or store the results into a database (a database-backed collector is sketched right after the example below).
Our example event collector will just do a var_dump on every event it receives from the thumbnailer :
<?php
include __DIR__ . '/common.php';

$socket = new ZMQSocket(new ZMQContext(), ZMQ::SOCKET_PULL);
$socket->bind(COLLECTOR_ADDR);

echo "Waiting for events.." . PHP_EOL;
while (($message = $socket->recvMulti())) {
    var_dump($message);
}
?>
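As mentioned above, the collector is also the natural place to persist results rather than just dumping them. Purely as an illustration (the PDO DSN, the credentials and the thumbnails table are invented for this sketch, not part of the original article), a database-backed collector could look like this:

<?php
include __DIR__ . '/common.php';

// Hypothetical collector variant: store each incoming result in a database.
// The DSN, credentials and table layout below are assumptions for the sketch.
$pdo  = new PDO('mysql:host=localhost;dbname=images', 'user', 'secret');
$stmt = $pdo->prepare(
    'INSERT INTO thumbnails (status, filename, thumbnail, message) VALUES (?, ?, ?, ?)'
);

$socket = new ZMQSocket(new ZMQContext(), ZMQ::SOCKET_PULL);
$socket->bind(COLLECTOR_ADDR);

echo "Waiting for events.." . PHP_EOL;
while (($message = $socket->recvMulti())) {
    // The parts arrive in the order the thumbnail callback returned them:
    // status, filename, thumbnail, message.
    $stmt->execute($message);
}
?>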
The final piece of the puzzle is the client that pumps messages into the pipeline. The client connects to the thumbnail worker, passes on filename and desired dimensions :
<?php
include __DIR__ . '/common.php';

$socket = new ZMQSocket(new ZMQContext(), ZMQ::SOCKET_PUSH);
$socket->connect(THUMBNAIL_ADDR);

$socket->sendMulti(
    array(
        'thumbnail',
        realpath('./test.jpg'),
        50,
        50,
    )
);
echo "Sent request" . PHP_EOL;
?>
After this our processing pipeline looks like this: the client pushes a job to the thumbnail worker, which pushes its result on to the event collector.
Now, if we notice that the thumbnail workers or the event collectors can't keep up with the rate of images we are pushing through, we can start scaling the pipeline by adding more processes on each layer. A ZeroMQ PUSH socket will automatically round-robin between all connected nodes, which makes adding more workers and event collectors simple. After adding more workers, the pipeline keeps the same shape, just with several workers and collectors running in parallel at each layer.
Using ZeroMQ also allows us to create more flexible architectures by adding forwarding devices in the middle, adding request-reply workers etc. So, the last thing to do is to run our pipeline and see the results :
Let’s create our test image first :
$ convert magick:rose test.jpg
From the command-line run the thumbnail script :
$ php thumbnail.php
Running thumbnail worker..
In a separate terminal window run the event collector :
$ php collector.php
Waiting for events..
And finally run the client to send the thumbnail request :
$ php client.php
Sent request
$
If everything went according to the plan you should now see the following output in the event collector window :
array(4) {
  [0]=> string(1) "1"
  [1]=> string(56) " /test.jpg"
  [2]=> string(62) " /test_50x50.jpg"
  [3]=> string(0) ""
}

Happy hacking!
-