
Other articles (60)
-
No talk of "market", "cloud", etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish so freely on web 2.0 and in the companies that live off it.
You are therefore invited to avoid terms such as "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
So no "Gold or Premium contract" is planned, no (...)
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including:
- critiques of existing features and functions
- articles contributed by developers, administrators, content producers and editors
- screenshots to illustrate the above
- translations of existing documentation into other languages
To contribute, register for the project users' mailing (...)
-
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP in specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an Internet access point, training, and innovative projects in the field of information and communication technologies, and hosts websites. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen associations of its kind. Its members (...)
On other sites (5784)
-
How to call an ffmpeg video command in Android
22 March 2021, by connor449
I am trying to use this command:


FFmpeg.execute("-f android_camera -i 0:0 -r 30 -pixel_format bgr0 -t 00:00:05 <record file path>");


from this library :


https://github.com/tanersener/mobile-ffmpeg


The command records video from the device camera (-f android_camera -i 0:0) at 30 fps (-r 30), in the bgr0 pixel format, for five seconds (-t 00:00:05), to the given record file path.


As a brand-new Android developer, I have the following question:


Where and how do I add this command to my code? I am following the basic Android tutorial that builds an app with a text field and a button, and I want the button to trigger this video command. Below is my code in MainActivity.kt:

package com.example.camera

import androidx.appcompat.app.AppCompatActivity
import android.content.Intent
import android.os.Bundle
import android.view.View
import android.widget.EditText
import com.arthenica.mobileffmpeg.FFmpeg



const val EXTRA_MESSAGE = "com.example.myfirstapp.MESSAGE"

class MainActivity : AppCompatActivity() {
 override fun onCreate(savedInstanceState: Bundle?) {
 super.onCreate(savedInstanceState)
 setContentView(R.layout.activity_main)
 }

 /** Called when the user taps the Send button */
 fun sendMessage(view: View) {
 val editText = findViewById<EditText>(R.id.editText)
 val message = editText.text.toString()
 val intent = Intent(this, DisplayMessageActivity::class.java).apply {
 putExtra(EXTRA_MESSAGE, message)
 }
 

 startActivity(intent)
 }
}


Is this the right file to add the command to? If so, how do I add it? For now, I just want the video to start recording when I click the button in the app.


I tried this :


val intent = Intent(FFmpeg.execute("-f android_camera -i 0:0 -r 30 -pixel_format bgr0 -t 00:00:05"))



but I got this error :


e: /Users/AndroidStudioProjects/camera/app/src/main/java/com/example/camera/MainActivity.kt: (27, 22): None of the following functions can be called with the arguments supplied: 
public constructor Intent(p0: Intent!) defined in android.content.Intent
public constructor Intent(p0: String!) defined in android.content.Intent
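
For reference, a minimal sketch of one way to wire the call to the button (assumptions, not from the original post: the mobile-ffmpeg dependency is installed, camera and storage permissions are already granted, and startRecordingClicked plus the output path are illustrative names). FFmpeg.execute() runs the command itself and returns a return code, so it is not wrapped in an Intent, and because the call blocks it is moved off the main thread:

import android.util.Log
import android.view.View
import com.arthenica.mobileffmpeg.Config
import com.arthenica.mobileffmpeg.FFmpeg

// Inside MainActivity; wired to the button via android:onClick="startRecordingClicked"
// in activity_main.xml (hypothetical wiring).
fun startRecordingClicked(view: View) {
    Thread {
        // Illustrative output path; this stands in for the original
        // <record file path> placeholder.
        val outputPath = "$filesDir/video.mp4"
        val rc = FFmpeg.execute(
            "-f android_camera -i 0:0 -r 30 -pixel_format bgr0 -t 00:00:05 $outputPath"
        )
        if (rc == Config.RETURN_CODE_SUCCESS) {
            Log.i("ffmpeg", "Recording saved to $outputPath")
        } else {
            Log.e("ffmpeg", "FFmpeg exited with rc=$rc")
        }
    }.start()
}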



-
dnn: add a new interface DNNModel.get_output
11 September 2020, by Guo, Yejun
For some cases (for example, super resolution), the DNN model changes the frame size, which impacts the filter behavior, so the filter needs to know the output frame size at the very beginning. Currently, the filter reuses DNNModule.execute_model to query the output frame size; this is not clear from an interface perspective, so add a new explicit interface, DNNModel.get_output, for such queries.
-
How to simultaneously capture the mic, stream it to an RTSP server and play it on the iPhone's speaker?
24 August 2021, by Norbert Towiański
I want to capture sound from the microphone, stream it to an RTSP server, and play it back simultaneously on the iPhone's speaker after getting the samples from the RTSP server, in a kind of loop. I use FFmpegKit and I want to use MobileVLCKit, but unfortunately the microphone goes off as soon as I start playing the stream (see the audio-session sketch at the end).
I think I've done the first step (capturing from the microphone and sending the OutputStream to the RTSP server):


@IBAction func transmitBtnPressed(_ sender: Any) {
 ffmpeg_transmit()
}

@IBAction func recordBtnPressed(_ sender: Any) {
 switch recordingState {
 case .idle:
 recordingState = .start
 startRecording()
 recordBtn.setTitle("Started", for: .normal)
 let urlToFile = URL(fileURLWithPath: outPipePath!)
 outputStream = OutputStream(url: urlToFile, append: false)
 outputStream!.open()
 case .capturing:
 recordingState = .end
 stopRecording()
 recordBtn.setTitle("End", for: .normal)
 default:
 break
 }
}

override func viewDidLoad() {
 super.viewDidLoad()
 outPipePath = FFmpegKitConfig.registerNewFFmpegPipe()
 self.setup()
}

override func viewDidAppear(_ animated: Bool) {
 super.viewDidAppear(animated)
 setUpAuthStatus()
}

func setUpAuthStatus() {
 if AVCaptureDevice.authorizationStatus(for: AVMediaType.audio) != .authorized {
 AVCaptureDevice.requestAccess(for: AVMediaType.audio, completionHandler: { (authorized) in
 DispatchQueue.main.async {
 if authorized {
 self.setup()
 }
 }
 })
 }
}

func setup() {
 self.session.sessionPreset = AVCaptureSession.Preset.high
 
 self.recordingURL = URL(fileURLWithPath: "\(NSTemporaryDirectory() as String)/file.m4a")
 if self.fileManager.isDeletableFile(atPath: self.recordingURL!.path) {
 _ = try? self.fileManager.removeItem(atPath: self.recordingURL!.path)
 }
 
 self.assetWriter = try? AVAssetWriter(outputURL: self.recordingURL!,
 fileType: AVFileType.m4a)
 self.assetWriter!.movieFragmentInterval = CMTime.invalid
 self.assetWriter!.shouldOptimizeForNetworkUse = true
 
 let audioSettings = [
 AVFormatIDKey: kAudioFormatLinearPCM,
 AVSampleRateKey: 48000.0,
 AVNumberOfChannelsKey: 1,
 AVLinearPCMIsFloatKey: false,
 AVLinearPCMBitDepthKey: 16,
 AVLinearPCMIsBigEndianKey: false,
 AVLinearPCMIsNonInterleaved: false,
 
 ] as [String : Any]
 
 
 self.audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
 outputSettings: audioSettings)
 
 self.audioInput?.expectsMediaDataInRealTime = true
 
 if self.assetWriter!.canAdd(self.audioInput!) {
 self.assetWriter?.add(self.audioInput!)
 }
 
 self.session.startRunning()
 
 DispatchQueue.main.async {
 self.session.beginConfiguration()
 
 self.session.commitConfiguration()
 
 let audioDevice = AVCaptureDevice.default(for: AVMediaType.audio)
 let audioIn = try? AVCaptureDeviceInput(device: audioDevice!)
 
 if self.session.canAddInput(audioIn!) {
 self.session.addInput(audioIn!)
 }
 
 if self.session.canAddOutput(self.audioOutput) {
 self.session.addOutput(self.audioOutput)
 }
 
 self.audioConnection = self.audioOutput.connection(with: AVMediaType.audio)
 }
}

func startRecording() {
 if self.assetWriter?.startWriting() != true {
 print("error: \(self.assetWriter?.error.debugDescription ?? "")")
 }
 
 self.audioOutput.setSampleBufferDelegate(self, queue: self.recordingQueue)
}

func stopRecording() {
 self.audioOutput.setSampleBufferDelegate(nil, queue: nil)
 
 self.assetWriter?.finishWriting {
 print("Saved in folder \(self.recordingURL!)")
 }
}
func captureOutput(_ captureOutput: AVCaptureOutput, didOutput
 sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
 
 if !self.isRecordingSessionStarted {
 let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
 self.assetWriter?.startSession(atSourceTime: presentationTime)
 self.isRecordingSessionStarted = true
 recordingState = .capturing
 }
 
 var blockBuffer: CMBlockBuffer?
 var audioBufferList: AudioBufferList = AudioBufferList.init()
 
 // Extract the raw PCM from the sample buffer into an AudioBufferList.
 CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout<AudioBufferList>.size, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, blockBufferOut: &blockBuffer)
 let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)
 
 for buffer in buffers {
 let u8ptr = buffer.mData!.assumingMemoryBound(to: UInt8.self)
 let output = outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))
 
 if (output == -1) {
 let error = outputStream?.streamError
 print("\(#file) > \(#function) > Error on outputStream: \(error!.localizedDescription)")
 }
 else {
 print("\(#file) > \(#function) > Data sent")
 }
 }
}

func ffmpeg_transmit() {
 
 // Read raw 48 kHz mono s16le PCM from the named pipe and publish it
 // as low-delay Opus over RTSP (UDP transport).
 let cmd1: String = "-f s16le -ar 48000 -ac 1 -i "
 let cmd2: String = " -probesize 32 -analyzeduration 0 -c:a libopus -application lowdelay -ac 1 -ar 48000 -f rtsp -rtsp_transport udp rtsp://localhost:18556/mystream"
 let cmd = cmd1 + outPipePath! + cmd2
 
 print(cmd)
 
 ffmpegSession = FFmpegKit.executeAsync(cmd, withExecuteCallback: { ffmpegSession in
 
 let state = ffmpegSession?.getState()
 let returnCode = ffmpegSession?.getReturnCode()
 if let returnCode = returnCode, let get = ffmpegSession?.getFailStackTrace() {
 print("FFmpeg process exited with state \(String(describing: FFmpegKitConfig.sessionState(toString: state!))) and rc \(returnCode).\(get)")
 }
 }, withLogCallback: { log in
 
 }, withStatisticsCallback: { statistics in
 
 })
}
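One plumbing note, as an assumption rather than something stated in the post: FFmpegKitConfig.registerNewFFmpegPipe() returns the path of a named pipe, and opening a FIFO for writing normally blocks until a reader is attached. So the FFmpeg session should be started before the OutputStream that feeds it is opened, roughly in this order:

// Sketch of the call order: attach the reader (FFmpeg) before the writer (mic PCM).
ffmpeg_transmit() // FFmpeg starts reading from outPipePath
outputStream = OutputStream(url: URL(fileURLWithPath: outPipePath!), append: false)
outputStream!.open() // writes from captureOutput(...) can now drain into the pipe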


I want to use MobileVLCKit in this way:


func startStream(){
 guard let url = URL(string: "rtsp://localhost:18556/mystream") else {return}
 audioPlayer!.media = VLCMedia(url: url)

 audioPlayer!.media.addOption("-vv")
 audioPlayer!.media.addOption("--network-caching=10000")

 audioPlayer!.delegate = self
 audioPlayer!.audio.volume = 100

 audioPlayer!.play()

}



Could you give me some hints on how to implement this?
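
One hint for the microphone going silent as soon as playback starts: by default an iOS audio session does not allow simultaneous input and output. A minimal sketch, assuming the shared AVAudioSession is configured once before the capture session starts (configureAudioSession is an illustrative name, not from the original code):

import AVFoundation

// .playAndRecord lets capture and playback coexist; .defaultToSpeaker routes
// output to the loudspeaker instead of the earpiece.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.defaultToSpeaker, .allowBluetooth])
        try session.setActive(true)
    } catch {
        print("AVAudioSession error: \(error)")
    }
}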