
Other articles (40)

  • The permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Requesting the creation of a channel

    12 March 2010, by

    Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at the moment of registration; the second, after registration, by filling in a request form.
    Both approaches ask for the same things and work in much the same way: the future user must fill in a series of form fields that first of all give the administrators information about (...)

On other sites (7429)

  • Choppy Audio while playing Video from StreamingAssets in Unity's VideoPlayer

    8 November 2017, by Saad Anees

    I have been trying to load a video that I recorded with AVPro Movie Capture (Free Version). The video file was in the gigabyte range, so I converted it with the ffmpeg command -y -i RawVideo.avi -qscale 7 FinalVideo.avi and saved the result to StreamingAssets. I got the desired result. Now I want to play that converted file in the VideoPlayer for preview. The problem is that when the video is played from a URL, the audio is very choppy. I played the same file in the Windows player and in VLC and it was fine; the problem occurs only in Unity's VideoPlayer.
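
    Spelled out in full, the conversion step looks like this on the command line. The second command is only a sketch of an alternative I have not verified (re-encoding to H.264/AAC in an MP4 container, in case the AVI container or codec is what the VideoPlayer dislikes; the codec settings here are assumptions):

    ffmpeg -y -i RawVideo.avi -qscale 7 FinalVideo.avi

    # unverified alternative: H.264 video and AAC audio in an .mp4 container
    ffmpeg -y -i RawVideo.avi -c:v libx264 -crf 23 -preset fast -c:a aac FinalVideo.mp4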

    PreviewVideo class:

    using UnityEngine;
    using UnityEngine.SceneManagement;
    using UnityEngine.Video;

    public class PreviewVideo : MonoBehaviour
    {

       public GameObject VideoSelection;
       public GameObject RecordingCanvas;
       public GameObject FacebookCanvas;
       public GameObject Home;
       public Sprite pauseImage;
       public Sprite playImage;
       public VideoPlayer videoPlayer;
       public GameObject EmailCanvas;

       // Use this for initialization
       void Start ()
       {

       }

       public void Referesh()
       {
           // Show the thumbnail for the selected recording, then point the player at the converted file in StreamingAssets.
           videoPlayer.gameObject.GetComponent<SpriteRenderer> ().sprite = Resources.Load<Sprite> ("Thumbnails/" + StaticVariables.VideoToPlay);

           videoPlayer.url = Application.streamingAssetsPath + "/FinalVideo.avi";
       }

       public void PlayVideo()
       {
           if (!videoPlayer.isPlaying) {
               videoPlayer.Play ();
           }
       }

       public void Back ()
       {
           this.gameObject.SetActive (false);
           VideoSelection.SetActive (true);
       }

       public void HomeBtn ()
       {
           SceneManager.LoadScene (0);
       }

       public void SendEmailDialogue()
       {
           EmailCanvas.SetActive (true);
           this.gameObject.SetActive (false);
       }

       public void FacebookShare()
       {
           FacebookCanvas.SetActive (true);
       }
    }

    Referesh() is called from the RecordingCanvas class:

    using System;
    using System.Collections;
    using System.Diagnostics;
    using System.IO;
    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.Video;

    public class RecordingCanvas : MonoBehaviour {

       public GameObject VideoSelection;
       public GameObject PreviewVideo;
       public GameObject Home;
       public GameObject canvas;
       public RawImage rawImage;
       public GameObject videoThumbnail;
       float _seconds;
       bool canStart = false;
       public SpriteRenderer NumSprite;
       public VideoPlayer videoPlayer;
       WebCamTexture webcamTexture;
       Process process;
       void Start ()
       {
           Refresh ();
       }

       public void Refresh()
       {
           _seconds = 0;
           NumSprite.gameObject.SetActive(true);
           webcamTexture = new WebCamTexture (1280, 720);
           webcamTexture.Stop ();
           rawImage.texture = webcamTexture;
           rawImage.material.mainTexture = webcamTexture;
           webcamTexture.Play ();
           videoPlayer.loopPointReached += VideoEndReached;

           videoPlayer.gameObject.GetComponent<SpriteRenderer> ().sprite = Resources.Load<Sprite> ("Thumbnails/" + StaticVariables.VideoToPlay);
           videoThumbnail.GetComponent<SpriteRenderer> ().sprite = Resources.Load<Sprite> ("Thumbnails/" + StaticVariables.VideoToPlay);
           videoPlayer.clip = Resources.Load<VideoClip> ("Videos/" + StaticVariables.VideoToPlay);
       }

       void Update()
       {
           _seconds += Time.deltaTime;
           print ((int)_seconds);
           if (_seconds < 1) {
               NumSprite.sprite = Resources.Load<Sprite> ("Numbers/3");
           }
           else if (_seconds < 2)
               NumSprite.sprite = Resources.Load<Sprite> ("Numbers/2");
           else if (_seconds < 3)
               NumSprite.sprite = Resources.Load<Sprite> ("Numbers/1");

           if (_seconds >= 3 && _seconds <= 4) {
               canStart = true;
           }

           if (canStart) {
               NumSprite.gameObject.SetActive(false);
               canStart = false;
               FindObjectOfType<CaptureGUI> ().StartCapture();
               videoPlayer.Play ();
               videoThumbnail.SetActive (false);
           }

       }

       IEnumerator StartConversion()
       {
           yield return new WaitForSeconds (1.5f);
           process = new Process();

           if (File.Exists (Application.streamingAssetsPath + "/FinalVideo.avi"))
               File.Delete(Application.streamingAssetsPath + "/FinalVideo.avi");

           process.StartInfo.WorkingDirectory = Application.streamingAssetsPath;
           process.StartInfo.FileName = Application.streamingAssetsPath + "/ffmpeg.exe";
           process.StartInfo.Arguments = " -y -i " + StaticVariables.RawVideo + ".avi " + "-qscale 7 " + StaticVariables.FinalVideo + ".avi";
           process.StartInfo.CreateNoWindow = false;
           process.EnableRaisingEvents = true;
           process.Exited += new EventHandler(Process_Exited);
           process.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
           process.Start();
           process.WaitForExit ();
           canvas.SetActive (false);
           PreviewVideo.SetActive (true);
           FindObjectOfType<PreviewVideo> ().Referesh ();
           File.Copy (Application.streamingAssetsPath + "/FinalVideo.avi", @"C:\xampp\htdocs\facebook\images\FinalVideo.avi", true);
           this.gameObject.SetActive (false);
       }

       void Process_Exited(object sender, EventArgs e)
       {
           process.Dispose ();
       }

       void VideoEndReached(UnityEngine.Video.VideoPlayer vp)
       {
           videoPlayer.Stop ();
           FindObjectOfType<CaptureGUI> ().StopCapture();
           webcamTexture.Stop ();
           canvas.SetActive (true);
           StartCoroutine(StartConversion ());
       }

    }

    I am using Unity version 2017.1.1p4 Personal Edition, on Windows 10 with a high-end PC. I am making this app for standalone PC.

    I am stuck here. Can’t proceed further. Please help me with this issue.

  • Build FFMPEG with x264 for Android

    12 November 2016, by Kage

    I am trying to build FFMPEG with libx264 for Android.

    I can successfully build and use FFMPEG for Android but I realized that I need the ability to encode, therefore I am trying to build FFMPEG with x264.

    I am using this tutorial to build FFmpeg for Android: http://www.roman10.net/how-to-build-ffmpeg-for-android/

    When trying to build FFMPEG I get an error:

    "ERROR : libx264 not found"

    And in my log it says:

    "/usr/local/lib/libx264.a : could not read symbols : Archive has no
    index ; run ranlib to add one..."
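
    The message itself suggests one quick check: regenerating the archive index with ranlib. A minimal sketch, assuming the library in question really is the one in /usr/local/lib; for a cross-compiled archive, the NDK toolchain's own ranlib (the $PREBUILT path from the build script further down) would be the one to run:

    # regenerate the archive index the linker is complaining about
    ranlib /usr/local/lib/libx264.a

    # for an ARM/Android build of x264, use the toolchain's ranlib instead
    $PREBUILT/bin/arm-linux-androideabi-ranlib /path/to/android-built/libx264.a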

    I have the latest versions of both FFMPEG and x264.
    I understand that FFMPEG looks for the headers and libraries in /usr/include and /usr/lib, so in order to make it find x264 I use these cflags and ldflags:

    • --extra-cflags="-I/usr/local/include"
    • --extra-ldflags="-L/usr/local/lib"

    I have tried building x264 with many different options that other people on the internet have said I need, e.g. --enable-shared, --enable-static, --disable-pthreads, etc.
    Some forums say to enable this, others say to disable that.
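
    What I have not been able to pin down is the right way to cross-compile x264 itself for the Android target before pointing FFmpeg at it. A rough sketch of what I mean, reusing the NDK variables from the build script further down (the flag set is an assumption based on x264's ./configure options, not a verified recipe):

    # cross-compile x264 with the NDK toolchain and install it into its own prefix
    cd x264
    ./configure \
       --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
       --sysroot=$PLATFORM \
       --host=arm-linux \
       --prefix=$PREFIX/x264 \
       --enable-static \
       --enable-pic \
       --disable-asm \
       --disable-cli
    make -j4
    make install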

    Any help would be much appreciated,
    Thanks

    EDIT:

    If I build FFmpeg with the simplest command to include libx264 then it works, i.e.:

    ./configure --enable-gpl --enable-libx264 --extra-cflags="-I/usr/local/include" --extra-ldflags="-L/usr/local/lib" --enable-static --enable-shared

    However, I need it to work for Android. The script I am using is:

    NDK=~/Desktop/android-ndk-r7
    PLATFORM=$NDK/platforms/android-8/arch-arm/
    PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/darwin-x86
    function build_one
    {
    ./configure --target-os=linux \
       --prefix=$PREFIX \
       --enable-cross-compile \
       --enable-shared \
       --enable-static \
       --extra-libs="-lgcc" \
       --arch=arm \
       --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
       --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
       --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
       --sysroot=$PLATFORM \
       --extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS -I/usr/local/include" \
       --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L $PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog -L/usr/local/lib " \
       --enable-gpl \
       --enable-libx264 \
       --disable-everything \
       --enable-demuxer=mov \
       --enable-demuxer=h264 \
       --disable-ffplay \
       --enable-protocol=file \
       --enable-avformat \
       --enable-avcodec \
       --enable-decoder=rawvideo \
       --enable-decoder=mjpeg \
       --enable-decoder=h263 \
       --enable-decoder=mpeg4 \
       --enable-decoder=h264 \
       --enable-encoder=mjpeg \
       --enable-encoder=h263 \
       --enable-encoder=mpeg4 \
       --enable-encoder=h264 \
       --enable-parser=h264 \
       --disable-network \
       --enable-zlib \
       --disable-avfilter \
       --disable-avdevice \
       $ADDITIONAL_CONFIGURE_FLAG

    make clean
    make  -j4 install
    $PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
    $PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib  -soname libffmpeg.so -shared -nostdlib  -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog  --warn-once  --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
    }

    CPU=armv7-a
    OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
    PREFIX=./android/$CPU
    ADDITIONAL_CONFIGURE_FLAG=
    build_one

    I am guessing that some option in my configure command is conflicting with enabling libx264.

    NOTE: If I remove --enable-libx264 then it works.
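
    One hedged way to check whether the libx264.a that configure picks up was even built for the ARM target rather than for the host machine (assuming the NDK binutils from the script above):

    # print the object format of the archive members; an ARM build should report
    # something like elf32-littlearm, while anything else (or "file format not
    # recognized") means the archive was built for the host
    $PREBUILT/bin/arm-linux-androideabi-objdump -f /usr/local/lib/libx264.a | grep -m1 'file format'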

  • Video Conferencing in HTML5: WebRTC via Web Sockets

    1 January 2014, by silvia

    A bit over a week ago I gave a presentation at Web Directions Code 2012 in Melbourne. Maxine and John asked me to speak about something related to HTML5 video, so I went for the new shiny: WebRTC, real-time communication in the browser.

    Presentation slides

    I only had 20 min, so I had to make it tight. I wanted to show off video conferencing without special plugins in Google Chrome in just a few lines of code, as is the promise of WebRTC. To a large extent, I achieved this. But I made some interesting discoveries along the way. Demos are in the slide deck.

    UPDATE: Opera 12 has been released with WebRTC support.

    Housekeeping: if you want to replicate what I have done, you need to install Google Chrome 19 or later. Then make sure you go to chrome://flags and activate the MediaStream and PeerConnection experiment(s). Restart your browser and you can now experiment with this feature. Big warning up-front: it's not production-ready, since there are still changes happening to the spec and there is no compatible implementation by another browser yet.

    Here is a brief summary of the steps involved in setting up video conferencing in your browser:

    1. Set up a video element each for the local and the remote video stream.
    2. Grab the local camera and stream it to the first video element.
    3. (*) Establish a connection to another person running the same Web page.
    4. Send the local camera stream on that peer connection.
    5. Accept the remote camera stream into the second video element.

    Now, the most difficult part of all of this (believe it or not) is the signalling that is required to build the peer connection (marked with (*)). Initially I wanted to run completely without a server and just enter the remote peer's IP address to establish the connection. This is, however, not a functionality that the PeerConnection object provides [might this be something to add to the spec?].

    So, you need a server known to both parties that can provide the handshake to set up the connection. All the examples that I have seen, such as https://apprtc.appspot.com/, use a channel management server on Google's App Engine. I wanted it all working with HTML5 technology, so I decided to use a Web Socket server instead.

    I implemented my Web Socket server using node.js (code of websocket server). The video conferencing demo is in the slide deck in an iframe – you can also use the stand-alone html page. Works like a treat.

    While it is still using Google's STUN server to get through NAT, the messaging for setting up the connection runs entirely through the Web Socket server. The messages that get exchanged are plain SDP packets with a session ID. There are OFFER, ANSWER, and OK packets exchanged for each streaming direction. You can see some of it in the image below:

    WebRTC demo

    I’m not running a public WebSocket server, so you won’t be able to see this part of the presentation working. But the local loopback video should work.

    At the conference, it all went without a hitch (while the wireless played along). I believe you have to host the WebSocket server on the same machine as the Web page, otherwise it won’t work for security reasons.

    A whole new world of opportunities lies out there when we get the ability to set up video conferencing on every Web page – scary and exciting at the same time!