Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (100)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    After it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature is operational right away. It is therefore not necessary to go through a configuration step for this.

  • Embellishing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). Templates define the placement of information on the page, defining a specific use of the platform, while themes define the overall graphic design.
    Anyone can propose a new graphic theme or template and make it available to the community.

On other sites (12391)

  • dnn : add a new interface DNNModel.get_output

    11 September 2020, by Guo, Yejun
    dnn : add a new interface DNNModel.get_output
    

    For some cases (for example, super resolution), the DNN model changes
    the frame size, which impacts the filter behavior, so the filter needs
    to know the output frame size at the very beginning.

    Currently, the filter reuses DNNModule.execute_model to query the
    output frame size; this is not clear from an interface perspective, so
    a new explicit interface, DNNModel.get_output, is added for such queries.

    • [DH] libavfilter/dnn/dnn_backend_native.c
    • [DH] libavfilter/dnn/dnn_backend_openvino.c
    • [DH] libavfilter/dnn/dnn_backend_tf.c
    • [DH] libavfilter/dnn_interface.h
    • [DH] libavfilter/vf_dnn_processing.c
    • [DH] libavfilter/vf_sr.c
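    The shape-query idea behind this change can be sketched in a few lines of C. This is a hypothetical, simplified mock, not FFmpeg's actual dnn_interface.h: the struct fields, the sr_get_output helper, and the x2 scale factor are all illustrative.

    ```c
    #include <stdio.h>

    /* Hypothetical, simplified mock of a get_output-style query interface.
     * The point: the filter asks the model for its output dimensions up
     * front instead of running a full inference just to learn the size. */
    typedef struct DNNData {
        int width;
        int height;
    } DNNData;

    typedef struct DNNModel {
        int scale; /* e.g. a x2 super-resolution model */
        /* query the output shape for a given input shape, without inference */
        DNNData (*get_output)(const struct DNNModel *model, DNNData input);
    } DNNModel;

    static DNNData sr_get_output(const DNNModel *model, DNNData input)
    {
        DNNData out = { input.width * model->scale, input.height * model->scale };
        return out;
    }

    int main(void)
    {
        DNNModel model = { 2, sr_get_output };
        DNNData in = { 640, 360 };
        /* the filter can now size its output before any frame is processed */
        DNNData out = model.get_output(&model, in);
        printf("%dx%d -> %dx%d\n", in.width, in.height, out.width, out.height);
        return 0;
    }
    ```

    With such a query, a filter like vf_sr can configure its output link geometry during init rather than after the first frame.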
  • How to simultaneously capture mic, stream it to RTSP server and play it on iPhone's speaker?

    24 August 2021, by Norbert Towiański

    I want to capture sound from the mic, stream it to an RTSP server, and simultaneously play it on the iPhone's speaker after getting the samples back from the RTSP server, i.e. a loop of that kind. I use FFmpegKit and I want to use MobileVLCKit, but unfortunately the microphone is off when I start playing the stream.
    I think I've done the first step (capturing from the microphone and sending an OutputStream to the RTSP server):

    @IBAction func transmitBtnPressed(_ sender: Any) {
        ffmpeg_transmit()
    }

    @IBAction func recordBtnPressed(_ sender: Any) {
        switch recordingState {
        case .idle:
            recordingState = .start
            startRecording()
            recordBtn.setTitle("Started", for: .normal)
            let urlToFile = URL(fileURLWithPath: outPipePath!)
            outputStream = OutputStream(url: urlToFile, append: false)
            outputStream!.open()
        case .capturing:
            recordingState = .end
            stopRecording()
            recordBtn.setTitle("End", for: .normal)
        default:
            break
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        outPipePath = FFmpegKitConfig.registerNewFFmpegPipe()
        self.setup()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setUpAuthStatus()
    }

    func setUpAuthStatus() {
        if AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) != .authorized {
            AVCaptureDevice.requestAccess(for: AVMediaType.audio, completionHandler: { (authorized) in
                DispatchQueue.main.async {
                    if authorized {
                        self.setup()
                    }
                }
            })
        }
    }

    func setup() {
        self.session.sessionPreset = AVCaptureSession.Preset.high

        self.recordingURL = URL(fileURLWithPath: "\(NSTemporaryDirectory() as String)/file.m4a")
        if self.fileManager.isDeletableFile(atPath: self.recordingURL!.path) {
            _ = try? self.fileManager.removeItem(atPath: self.recordingURL!.path)
        }

        self.assetWriter = try? AVAssetWriter(outputURL: self.recordingURL!,
                                              fileType: AVFileType.m4a)
        self.assetWriter!.movieFragmentInterval = CMTime.invalid
        self.assetWriter!.shouldOptimizeForNetworkUse = true

        let audioSettings = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 48000.0,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsBigEndianKey: false,
            AVLinearPCMIsNonInterleaved: false
        ] as [String : Any]

        self.audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
                                             outputSettings: audioSettings)

        self.audioInput?.expectsMediaDataInRealTime = true

        if self.assetWriter!.canAdd(self.audioInput!) {
            self.assetWriter?.add(self.audioInput!)
        }

        self.session.startRunning()

        DispatchQueue.main.async {
            self.session.beginConfiguration()

            self.session.commitConfiguration()

            let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio)
            let audioIn = try? AVCaptureDeviceInput(device: audioDevice!)

            if self.session.canAddInput(audioIn!) {
                self.session.addInput(audioIn!)
            }

            if self.session.canAddOutput(self.audioOutput) {
                self.session.addOutput(self.audioOutput)
            }

            self.audioConnection = self.audioOutput.connection(with: AVMediaType.audio)
        }
    }

    func startRecording() {
        if self.assetWriter?.startWriting() != true {
            print("error: \(self.assetWriter?.error.debugDescription ?? "")")
        }

        self.audioOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
    }

    func stopRecording() {
        self.audioOutput.setSampleBufferDelegate(nil, queue: nil)

        self.assetWriter?.finishWriting {
            print("Saved in folder \(self.recordingURL!)")
        }
    }

    func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        if !self.isRecordingSessionStarted {
            let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            self.assetWriter?.startSession(atSourceTime: presentationTime)
            self.isRecordingSessionStarted = true
            recordingState = .capturing
        }

        var blockBuffer: CMBlockBuffer?
        var audioBufferList: AudioBufferList = AudioBufferList.init()

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout<AudioBufferList>.size, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, blockBufferOut: &blockBuffer)
        let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)

        for buffer in buffers {
            let u8ptr = buffer.mData!.assumingMemoryBound(to: UInt8.self)
            let output = outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))

            if (output == -1) {
                let error = outputStream?.streamError
                print("\(#file) > \(#function) > Error on outputStream: \(error!.localizedDescription)")
            }
            else {
                print("\(#file) > \(#function) > Data sent")
            }
        }
    }

    func ffmpeg_transmit() {

        let cmd1: String = "-f s16le -ar 48000 -ac 1 -i "
        let cmd2: String = " -probesize 32 -analyzeduration 0 -c:a libopus -application lowdelay -ac 1 -ar 48000 -f rtsp -rtsp_transport udp rtsp://localhost:18556/mystream"
        let cmd = cmd1 + outPipePath! + cmd2

        print(cmd)

        ffmpegSession = FFmpegKit.executeAsync(cmd, withExecuteCallback: { ffmpegSession in

            let state = ffmpegSession?.getState()
            let returnCode = ffmpegSession?.getReturnCode()
            if let returnCode = returnCode, let get = ffmpegSession?.getFailStackTrace() {
                print("FFmpeg process exited with state \(String(describing: FFmpegKitConfig.sessionState(toString: state!))) and rc \(returnCode).\(get)")
            }
        }, withLogCallback: { log in

        }, withStatisticsCallback: { statistics in

        })
    }


    I want to use MobileVLCKit this way:


    func startStream() {
        guard let url = URL(string: "rtsp://localhost:18556/mystream") else { return }
        audioPlayer!.media = VLCMedia(url: url)

        audioPlayer!.media.addOption("-vv")
        audioPlayer!.media.addOption("--network-caching=10000")

        audioPlayer!.delegate = self
        audioPlayer!.audio.volume = 100

        audioPlayer!.play()
    }


    Could you give me some hints on how to implement that?


  • How to trim video in Android after camera capture?

    2 July 2017, by Sathwik Gangisetty

    I have an app that captures images and videos. I'm able to implement picture cropping, but I cannot implement video trimming. I tried using mp4parser and ffmpeg, but I could not make them work. Can anyone suggest a tutorial, or have a look at my MainActivity.java code and tell me what to do? I even tried K4LVideoTrimmer, but couldn't make it work because the GitHub documentation is not clear.
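    Since the question mentions trying ffmpeg, one low-effort route worth testing before any Java trimming library is a stream-copy trim via the ffmpeg command line (for example through an FFmpeg wrapper for Android). The file names and timestamps below are hypothetical, and with -c copy ffmpeg cuts on keyframes, so the cut points may shift slightly:

    ```shell
    # Keep the section from 2s to 8s of the recorded clip.
    # -c copy remuxes without re-encoding (fast and lossless, but keyframe-aligned).
    ffmpeg -i VID_20170702_120000.mp4 -ss 00:00:02 -to 00:00:08 -c copy VID_trimmed.mp4
    ```

    Re-encoding (dropping -c copy and specifying codecs) gives frame-accurate cuts at the cost of speed and quality.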

    Here is my MainActivity.java file.

    package com.example.sathwik.uploadtrail1;

    import java.io.File;
    import java.io.IOException;
    import java.io.InputStream;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.Locale;
    import android.app.Activity;
    import android.content.Context;
    import android.content.DialogInterface;
    import android.content.Intent;
    import android.content.SharedPreferences;
    import android.content.pm.PackageManager;
    import android.database.Cursor;
    import android.graphics.Color;
    import android.graphics.drawable.ColorDrawable;
    import android.net.Uri;
    import android.os.Bundle;
    import android.os.Environment;
    import android.provider.MediaStore;
    import android.support.annotation.NonNull;
    import android.support.v7.app.AppCompatActivity;
    import android.util.Log;
    import android.view.Menu;
    import android.view.MenuItem;
    import android.view.View;
    import android.widget.Button;
    import android.widget.Toast;

    import com.yalantis.ucrop.UCrop;

    import life.knowledge4.videotrimmer.K4LVideoTrimmer;

    import static android.R.attr.path;

    public class MainActivity extends AppCompatActivity {
       // LogCat tag
       private static final String TAG = MainActivity.class.getSimpleName();
       // Camera activity request codes
       private static final int CAMERA_CAPTURE_IMAGE_REQUEST_CODE = 100;
       private static final int CAMERA_CAPTURE_VIDEO_REQUEST_CODE = 200;
       public static final int MEDIA_TYPE_IMAGE = 1;
       public static final int MEDIA_TYPE_VIDEO = 2;
       final int PIC_CROP = 3;
       final int VIDEO_TRIM = 4;
       private File outputFile = null;
      // public static final int REQUEST_CROP = UCrop.REQUEST_CROP;

       private Uri fileUri; // file url to store image/video
      // private Uri fileUriCrop;
       private Button btnCapturePicture, btnRecordVideo;

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);

           // Changing action bar background color

           //getActionBar().setBackgroundDrawable(new ColorDrawable(Color.parseColor(getResources().getString(R.color.action_bar))));
           btnCapturePicture = (Button) findViewById(R.id.btnCapturePicture);
           btnRecordVideo = (Button) findViewById(R.id.btnRecordVideo);

           /**
            * Capture image button click event
            */
           btnCapturePicture.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
                   captureImage();
               }
           });

           /**
            * Record video button click event
            */
           btnRecordVideo.setOnClickListener(new View.OnClickListener() {
               @Override
               public void onClick(View v) {
                   // record video
                   recordVideo();
               }
           });

           // Checking camera availability
           if (!isDeviceSupportCamera()) {
               Toast.makeText(getApplicationContext(),
                       "Sorry! Your device doesn't support camera",
                       Toast.LENGTH_LONG).show();
               // will close the app if the device doesn't have a camera
               finish();
           }
       }

       /**
        * Checking whether the device has camera hardware or not
        */
       private boolean isDeviceSupportCamera() {
           if (getApplicationContext().getPackageManager().hasSystemFeature(
                   PackageManager.FEATURE_CAMERA)) {
               // this device has a camera
               return true;
           } else {
               // no camera on this device
               return false;
           }
       }

       /**
        * Launching camera app to capture image
        */
       private void captureImage() {
           Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
           fileUri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE);
           intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri);
           // start the image capture Intent
           startActivityForResult(intent, CAMERA_CAPTURE_IMAGE_REQUEST_CODE);

       }


       /**
        * Launching camera app to record video
        */
       private void recordVideo() {
           Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
           fileUri = getOutputMediaFileUri(MEDIA_TYPE_VIDEO);
           // set video quality
           intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
           intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri);// set the image file
           // name
           // start the video capture Intent
           startActivityForResult(intent, CAMERA_CAPTURE_VIDEO_REQUEST_CODE);
       }

      /* *//** Perform UCrop *//*
       public void performUcrop(){
           UCrop.of(fileUri, fileUri).start(this);

       }*/
       private void performcrop(){
          // UCrop.of(fileUri, fileUri).start(this);
           Intent cropIntent = new Intent("com.android.camera.action.CROP");
           cropIntent.setDataAndType(fileUri, "image/*");

           cropIntent.putExtra("crop", "true");

           cropIntent.putExtra("aspectX", 1);
           cropIntent.putExtra("aspectY", 1);

           cropIntent.putExtra("outputX", 256);
           cropIntent.putExtra("outputY", 256);
           cropIntent.putExtra("return-data", true);
           //cropIntent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(outputFile));
           startActivityForResult(cropIntent, PIC_CROP);

       }



       /**
        * Here we store the file url as it will be null after returning from
        * the camera app
        */
           @Override
           protected void onSaveInstanceState(Bundle outState) {
               super.onSaveInstanceState(outState);
               // save file url in bundle as it will be null on screen orientation
               // changes
               outState.putParcelable("file_uri", fileUri);
           }
           @Override
           protected void onRestoreInstanceState (@NonNull Bundle savedInstanceState){
               super.onRestoreInstanceState(savedInstanceState);
               // get the file url
               fileUri = savedInstanceState.getParcelable("file_uri");
           }
            /**
             * Receiving activity result; called after closing the camera
             */
           @Override
           protected void onActivityResult ( int requestCode, int resultCode, Intent data){
               // if the result is capturing Image
               if (requestCode == CAMERA_CAPTURE_IMAGE_REQUEST_CODE) {
                   if (resultCode == RESULT_OK) {
                       // successfully captured the image
                       //Log.d("ucrop", "error log" + fileUri);
                      // performUcrop();
                       //final Uri fileUri = UCrop.getOutput(data);
                       //fileUri = data.getData();
                       performcrop();

                       // launching upload activity

                   } else if (resultCode == RESULT_CANCELED) {
                       // user cancelled Image capture
                       Toast.makeText(getApplicationContext(),
                               "Sorry! Failed to capture image", Toast.LENGTH_SHORT).show();
                   }
               } else if (requestCode == CAMERA_CAPTURE_VIDEO_REQUEST_CODE) {
                   if (resultCode == RESULT_OK) {
                       // video successfully recorded
                       //launching upload activity
                       //videotrim();
                       launchUploadActivity(false);
                   } else if (resultCode == RESULT_CANCELED) {
                       // user cancelled recording
                       Toast.makeText(getApplicationContext(), "Sorry! Failed to record video", Toast.LENGTH_SHORT).show();
                   }
               }

                else if (requestCode == PIC_CROP) {
                    if (resultCode == RESULT_OK) {
                        //fileUri = Uri.fromFile(outputFile);
                        launchUploadActivity(true);
                    }
                }
            }
       private void launchUploadActivity(boolean isImage){
           Intent i = new Intent(MainActivity.this, UploadActivity.class);
           i.putExtra("filePath", fileUri.getPath());
           i.putExtra("isImage", isImage);
           startActivity(i);
       }

       /**
        * Creating file uri to store image/video
        */
       public Uri getOutputMediaFileUri(int type) {
           return Uri.fromFile(getOutputMediaFile(type));
       }

       /**
        * Returning the image / video file
        */
       private static File getOutputMediaFile(int type) {

           // External sdcard location
           File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES),
                   Config.IMAGE_DIRECTORY_NAME);

           // Create the storage directory if it does not exist
           if (!mediaStorageDir.exists()) {
               if (!mediaStorageDir.mkdirs()) {
                Log.d(TAG, "Oops! Failed to create " + Config.IMAGE_DIRECTORY_NAME + " directory");
                   return null;
               }
           }
           // Create a media file name
           String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss",Locale.getDefault()).format(new Date());
           File mediaFile;
           if (type == MEDIA_TYPE_IMAGE) {
               mediaFile = new File(mediaStorageDir.getPath() + File.separator+ "IMG_" + timeStamp + ".jpg");
           } else if (type == MEDIA_TYPE_VIDEO) {
               mediaFile = new File(mediaStorageDir.getPath() + File.separator+ "VID_" + timeStamp + ".mp4");
           } else {
               return null;
           }
           return mediaFile;
       }

       //Logout function
       private void logout(){
           //Creating an alert dialog to confirm logout
           android.support.v7.app.AlertDialog.Builder alertDialogBuilder = new android.support.v7.app.AlertDialog.Builder(this);
           alertDialogBuilder.setMessage("Are you sure you want to logout?");
           alertDialogBuilder.setPositiveButton("Yes",
                   new DialogInterface.OnClickListener() {
                       @Override
                       public void onClick(DialogInterface arg0, int arg1) {

                            // Getting our SharedPreferences
                           SharedPreferences preferences = getSharedPreferences(Config.SHARED_PREF_NAME, Context.MODE_PRIVATE);
                           //Getting editor
                           SharedPreferences.Editor editor = preferences.edit();

                            // Putting the value false for loggedin
                           editor.putBoolean(Config.LOGGEDIN_SHARED_PREF, false);

                           //Putting blank value to email
                           editor.putString(Config.EMAIL_SHARED_PREF, "");

                           //Saving the sharedpreferences
                           editor.commit();

                           //Starting login activity
                           Intent intent = new Intent(MainActivity.this, LoginActivity.class);
                           startActivity(intent);
                       }
                   });

           alertDialogBuilder.setNegativeButton("No",
                   new DialogInterface.OnClickListener() {
                       @Override
                       public void onClick(DialogInterface arg0, int arg1) {

                       }
                   });

           //Showing the alert dialog
           android.support.v7.app.AlertDialog alertDialog = alertDialogBuilder.create();
           alertDialog.show();

       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           getMenuInflater().inflate(R.menu.main, menu);
           return true;
       }

       @Override
       public boolean onOptionsItemSelected(MenuItem item) {
           int id = item.getItemId();
           if (id == R.id.menuLogout) {
               logout();
           }
           return super.onOptionsItemSelected(item);
       }

    }