Media (91)

Other articles (43)

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and we certainly don’t claim to be the best at it either. What we do, we simply try to do well, and to keep doing better.
    The following list covers software that is more or less similar to MediaSPIP, or whose aims MediaSPIP more or less shares.
    We don’t know them and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

  • Customize by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Submitting improvements and additional plugins

    10 April 2011

    If you have developed a new plugin that adds one or more useful features to MediaSPIP, let us know and its integration into the official distribution will be considered.
    You can use the development mailing list to announce it or to ask for help with writing the plugin. Since MediaSPIP is based on SPIP, it is also possible to use SPIP’s SPIP-zone mailing list to (...)

On other sites (8003)

  • Reading a File while it is being written by FFMPEG

    29 May 2014, by Haris Tasawar

    As the title suggests, I am building a client/server application in which the server (PHP) reads a file while it is being written by ffmpeg and streams it to the client (Java). I have managed to write the server script: it starts ffmpeg, waits a moment, then begins reading the file and concurrently sends it to the Java client. The problem is that after a while the client stops receiving any data and simply quits. For example, with a 5 MB file it reads only 12 KB before quitting. Could someone tell me what the issue is here? Is it on the server side or the client side?
    For reference, I am attaching both the PHP code that reads the file and the client-side code.

    Code for reading the file while FFmpeg converts it (PHP):

    $file = 'D:\\'.$destination;
    $fp = @fopen($file, 'r');
    //ob_end_clean();
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Cache-control: private');
    header('Pragma: private');
    header('Expires: Mon, 26 Jul 1997 05:00:00 GMT');
    // Content-Length is computed once, here, while ffmpeg may still be writing
    header('Content-Length: ' . filesize($file));

    $buffer = 4096;
    while (!feof($fp)) {
        usleep(300000);           // wait 0.3 s for ffmpeg to produce more data
        echo fread($fp, $buffer); // relay up to 4 KB to the client
        ob_flush();
        flush();
    }
    fclose($fp);
    exit();

    Code for reading the file transferred by PHP (Java):

    URL u = new URL(streamURL);

    clientSocket = (HttpURLConnection) u.openConnection();

    if (clientSocket.getResponseCode() == HttpURLConnection.HTTP_OK) {
        //clientSocket.setReadTimeout(0);
        inRemoteStream = clientSocket.getInputStream();

        while ((count = inRemoteStream.read(buf)) != -1) {
            offset += count;
            System.out.println(offset);
            fileOutputStream.write(buf, 0, count);
            setVideoOffset(offset, contentLength);
        }
    }

    I'll be very happy if someone can solve this problem for me. :)
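
    One possible explanation, offered as an assumption rather than a confirmed diagnosis: Content-Length is computed from filesize($file) while ffmpeg is still writing, so the header advertises only the bytes present at that instant, and HttpURLConnection stops delivering data once that many bytes have been read. A minimal sketch of a client that relies on connection close instead of a declared length (the URL, file name and class name are placeholders, not from the post):

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class GrowingFileClient {
        public static void main(String[] args) throws IOException {
            // Placeholder URL standing in for the poster's streamURL
            URL url = new URL("http://example.com/stream.php");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setReadTimeout(0); // never time out while ffmpeg is still writing

            if (conn.getResponseCode() == HttpURLConnection.HTTP_OK) {
                try (InputStream in = conn.getInputStream();
                     OutputStream out = new FileOutputStream("download.bin")) {
                    byte[] buf = new byte[4096];
                    long offset = 0;
                    int count;
                    // Without a premature Content-Length, read() returns -1 only
                    // when the server closes the connection, i.e. once the PHP
                    // loop has ended after ffmpeg finished writing.
                    while ((count = in.read(buf)) != -1) {
                        offset += count;
                        out.write(buf, 0, count);
                    }
                    System.out.println("received " + offset + " bytes");
                }
            }
        }
    }

    On the server side, the matching change would be to omit the Content-Length header until the transcode is known to be finished, so the response ends only when PHP closes the connection.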

  • How to simultaneously capture the mic, stream it to an RTSP server and play it on the iPhone's speaker?

    24 August 2021, by Norbert Towiański

    I want to capture sound from the mic, stream it to an RTSP server, and play it back on the iPhone's speaker after getting the samples from the RTSP server; a loop of that kind. I am using FFmpegKit and I want to use MobileVLCKit, but unfortunately the microphone goes silent as soon as I start playing the stream.
    I think I've completed the first step (capturing from the microphone and sending the OutputStream to the RTSP server):

    


    @IBAction func transmitBtnPressed(_ sender: Any) {
        ffmpeg_transmit()
    }

    @IBAction func recordBtnPressed(_ sender: Any) {
        switch recordingState {
        case .idle:
            recordingState = .start
            startRecording()
            recordBtn.setTitle("Started", for: .normal)
            let urlToFile = URL(fileURLWithPath: outPipePath!)
            outputStream = OutputStream(url: urlToFile, append: false)
            outputStream!.open()
        case .capturing:
            recordingState = .end
            stopRecording()
            recordBtn.setTitle("End", for: .normal)
        default:
            break
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        outPipePath = FFmpegKitConfig.registerNewFFmpegPipe()
        self.setup()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setUpAuthStatus()
    }

    func setUpAuthStatus() {
        if AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) != .authorized {
            AVCaptureDevice.requestAccess(for: AVMediaType.audio, completionHandler: { (authorized) in
                DispatchQueue.main.async {
                    if authorized {
                        self.setup()
                    }
                }
            })
        }
    }

    func setup() {
        self.session.sessionPreset = AVCaptureSession.Preset.high

        self.recordingURL = URL(fileURLWithPath: "\(NSTemporaryDirectory() as String)/file.m4a")
        if self.fileManager.isDeletableFile(atPath: self.recordingURL!.path) {
            _ = try? self.fileManager.removeItem(atPath: self.recordingURL!.path)
        }

        self.assetWriter = try? AVAssetWriter(outputURL: self.recordingURL!,
                                              fileType: AVFileType.m4a)
        self.assetWriter!.movieFragmentInterval = CMTime.invalid
        self.assetWriter!.shouldOptimizeForNetworkUse = true

        let audioSettings = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 48000.0,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsBigEndianKey: false,
            AVLinearPCMIsNonInterleaved: false,
        ] as [String : Any]

        self.audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
                                             outputSettings: audioSettings)

        self.audioInput?.expectsMediaDataInRealTime = true

        if self.assetWriter!.canAdd(self.audioInput!) {
            self.assetWriter?.add(self.audioInput!)
        }

        self.session.startRunning()

        DispatchQueue.main.async {
            self.session.beginConfiguration()

            self.session.commitConfiguration()

            let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio)
            let audioIn = try? AVCaptureDeviceInput(device: audioDevice!)

            if self.session.canAddInput(audioIn!) {
                self.session.addInput(audioIn!)
            }

            if self.session.canAddOutput(self.audioOutput) {
                self.session.addOutput(self.audioOutput)
            }

            self.audioConnection = self.audioOutput.connection(with: AVMediaType.audio)
        }
    }

    func startRecording() {
        if self.assetWriter?.startWriting() != true {
            print("error: \(self.assetWriter?.error.debugDescription ?? "")")
        }

        self.audioOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
    }

    func stopRecording() {
        self.audioOutput.setSampleBufferDelegate(nil, queue: nil)

        self.assetWriter?.finishWriting {
            print("Saved in folder \(self.recordingURL!)")
        }
    }

    func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        if !self.isRecordingSessionStarted {
            let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            self.assetWriter?.startSession(atSourceTime: presentationTime)
            self.isRecordingSessionStarted = true
            recordingState = .capturing
        }

        var blockBuffer: CMBlockBuffer?
        var audioBufferList: AudioBufferList = AudioBufferList.init()

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout<AudioBufferList>.size, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, blockBufferOut: &blockBuffer)
        let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)

        for buffer in buffers {
            let u8ptr = buffer.mData!.assumingMemoryBound(to: UInt8.self)
            let output = outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))

            if (output == -1) {
                let error = outputStream?.streamError
                print("\(#file) > \(#function) > Error on outputStream: \(error!.localizedDescription)")
            }
            else {
                print("\(#file) > \(#function) > Data sent")
            }
        }
    }

    func ffmpeg_transmit() {

        let cmd1: String = "-f s16le -ar 48000 -ac 1 -i "
        let cmd2: String = " -probesize 32 -analyzeduration 0 -c:a libopus -application lowdelay -ac 1 -ar 48000 -f rtsp -rtsp_transport udp rtsp://localhost:18556/mystream"
        let cmd = cmd1 + outPipePath! + cmd2

        print(cmd)

        ffmpegSession = FFmpegKit.executeAsync(cmd, withExecuteCallback: { ffmpegSession in

            let state = ffmpegSession?.getState()
            let returnCode = ffmpegSession?.getReturnCode()
            if let returnCode = returnCode, let get = ffmpegSession?.getFailStackTrace() {
                print("FFmpeg process exited with state \(String(describing: FFmpegKitConfig.sessionState(toString: state!))) and rc \(returnCode).\(get)")
            }
        }, withLogCallback: { log in

        }, withStatisticsCallback: { statistics in

        })
    }


    I want to use MobileVLCKit in this way:


    func startStream() {
        guard let url = URL(string: "rtsp://localhost:18556/mystream") else { return }
        audioPlayer!.media = VLCMedia(url: url)

        audioPlayer!.media.addOption("-vv")
        audioPlayer!.media.addOption("--network-caching=10000")

        audioPlayer!.delegate = self
        audioPlayer!.audio.volume = 100

        audioPlayer!.play()
    }


    Could you give me some hints on how to implement this?
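
    A hint, offered as an assumption rather than a verified fix: on iOS, capture and playback share the one AVAudioSession, and starting a player while the session is in a play-only category can shut the microphone off. Configuring the session for simultaneous recording and playback before starting either the capture session or the VLC player may keep both alive. A minimal sketch using the standard AVFoundation API:

    import AVFoundation

    func configureAudioSession() {
        let session = AVAudioSession.sharedInstance()
        do {
            // .playAndRecord keeps the microphone live while audio is playing;
            // .defaultToSpeaker routes output to the loudspeaker rather than
            // the earpiece, which is what a monitoring loop usually wants.
            try session.setCategory(.playAndRecord,
                                    mode: .default,
                                    options: [.defaultToSpeaker, .allowBluetooth])
            try session.setActive(true)
        } catch {
            print("AVAudioSession setup failed: \(error)")
        }
    }

    Calling this once near the top of viewDidLoad(), before self.setup(), would be a natural place for it.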


  • HTML5 video player doesn't work on iPad/iPhone

    16 April 2015, by Avrohom Yisroel

    NOTE: This turned out to be a simulator problem, not a video encoding problem; see my edit further down...

    I’m creating a web site for a local college, and they want to be able to add short videos that people can view online. I’ve spent quite a bit of time trying to work out how to get the videos to play on iDevices, but have failed.

    I’m using Video.js (http://www.videojs.com), and have HTML that looks like this...

    <video class="video-js vjs-default-skin" controls="controls" preload="auto" width="640" height="352" poster="/Content/Images/logobg.png" data-setup="{}">
      <source src="/Content/Videos/video.m4v" type="video/mp4">
      <source src="/Content/Videos/video.webm" type="video/webm">
      <p class="vjs-no-js">
        To view this video please enable JavaScript, and consider upgrading to a web browser that <a href="http://videojs.com/html5-video-support/" target="_blank">supports HTML5 video</a>
      </p>
    </video>

    This works fine on desktop browsers, where it uses the m4v file. However, if I load the page on an iDevice, the video player says "no compatible source was found for this video", which sounds like it doesn’t like either the m4v or the webm file.

    I created the webm file using instructions found at http://daniemon.com/blog/how-to-convert-videos-to-webm-with-ffmpeg/. I tried creating a .mov file using the accepted answer at iPad Doesn’t Render H.264 Video with HTML5, but this gave the same error.

    Does anyone have any ideas how I can support iDevices? Please don't blind me with science; I'm a real newbie with all this video stuff and need simple instructions!

    Edit: The problem I was having only occurred when viewing the site in a mobile simulator. When I uploaded the site to a real server and tried it on an iPad, it worked fine. So, if anyone is having a similar problem, first use something like Handbrake to encode the videos, as it seems to do the job fine, then make sure you're testing on a real mobile device, not a simulator!
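
    For readers whose problem really is the encoding rather than the simulator, a commonly used ffmpeg invocation for iPad/iPhone-friendly H.264 looks like this (a sketch, with placeholder file names; older iDevices generally want the baseline profile and the yuv420p pixel format):

    ffmpeg -i input.m4v -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p -c:a aac -movflags +faststart output.mp4

    The +faststart flag moves the index to the front of the file so playback can begin before the download finishes.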