
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (77)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.
-
Retrieving information from the master site when installing an instance
26 November 2010
Purpose
On the main site, a mutualisation instance is defined by several things: its data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the mutualisation instance.
It can therefore be quite sensible to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)
-
No talk of market, cloud, etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish around Web 2.0 and in the companies that live off it.
You are therefore invited to avoid using terms such as "Brand", "Cloud" or "Market".
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creative work on the Internet and lets authors keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...)
On other sites (6937)
-
Reading h264 or h265 stream with ffmpeg/OpenCV: Which is faster?
25 November 2020, by suckss
I'm using OpenCV with FFmpeg support to read an RTSP stream coming from an IP camera and then write the frames to a video file. The problem is that the frame size is 2816x2816 at 20 fps, i.e. there is a lot of data coming in.


I noticed that there was a significant delay in the stream, so I set the buffer size of the cv::VideoCapture object to 1, because I thought that the frames might just be getting stuck in the buffer instead of being grabbed and processed. However, this just caused frames to be dropped instead.

My next move was to experiment a bit with the frame size/fps and with the encoding of the video I'm writing. All of those things helped to improve the situation, but in the long run I still have to use a frame size of 2816x2816 and support up to 20 fps, so sadly I can't set them any lower.


That's where my question comes in: given that the camera stream is going to be either h264 or h265, which one would be read faster by the cv::VideoCapture object? And how should I encode the video I'm writing in order to minimize the time spent decoding/encoding frames?

That's the code I'm using for reference:


#include <opencv2/opencv.hpp>

using namespace cv;

int main(int argc, char** argv)
{
    VideoCapture cap;
    if (!cap.open("rtsp://admin:admin@1.1.1.1:554/stream")) {
        return -1;
    }
    // Keep at most one frame in the internal buffer so stale frames do not
    // pile up (set after open(); note that not every backend honours this).
    cap.set(CAP_PROP_BUFFERSIZE, 1);

    VideoWriter videoWr;
    Mat frame;
    cap >> frame;               // grab one frame so the writer gets the right size
    if (frame.empty()) {
        return -1;
    }

    //int fourcc = cv::VideoWriter::fourcc('x', '2', '6', '4'); // I was trying different options
    int fourcc = cv::VideoWriter::fourcc('M', 'J', 'P', 'G');
    videoWr = cv::VideoWriter("test_video.avi", fourcc, 20.0, frame.size(), true);

    namedWindow("test", WINDOW_NORMAL);
    cv::resizeWindow("test", 1024, 768);

    for (;;)
    {
        cap >> frame;
        if (frame.empty()) break;        // end of video stream

        imshow("test", frame);
        if (waitKey(10) == 27) break;    // ESC quits
        videoWr << frame;
    }

    return 0;
}
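One way to settle the h264 vs. h265 half of the question empirically is to time the capture loop itself. Below is a minimal sketch using cv::TickMeter, with the RTSP URL and frame count as placeholders: run it once against an h264 stream and once against an h265 stream, then compare the averages on the target hardware.

#include <opencv2/opencv.hpp>
#include <iostream>

// Measure the average per-frame grab+decode cost of a stream so that an
// h264 and an h265 source can be compared with the exact same code path.
int main()
{
    // Placeholder URL: point this at the h264 stream, then at the h265 stream.
    cv::VideoCapture cap("rtsp://admin:admin@1.1.1.1:554/stream");
    if (!cap.isOpened()) {
        std::cerr << "Could not open stream" << std::endl;
        return -1;
    }

    const int kFrames = 200;   // placeholder number of frames to measure
    cv::Mat frame;
    cv::TickMeter tm;

    for (int i = 0; i < kFrames; ++i) {
        tm.start();
        cap >> frame;          // grab + decode one frame
        tm.stop();
        if (frame.empty()) break;
    }

    if (tm.getCounter() > 0) {
        std::cout << "Average grab+decode time: "
                  << tm.getTimeMilli() / tm.getCounter() << " ms/frame" << std::endl;
    }
    return 0;
}

On the writing side, MJPG stays comparatively cheap to encode because every frame is coded independently, at the cost of much larger files; an H.264 writer would produce smaller output but typically costs noticeably more CPU per frame at 2816x2816.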



-
Reading JPEG in ffmpeg
16 July 2021, by Paul Lammertsma
I'm trying to get ffmpeg to encode several individual JPEG images into a video on Android. I've successfully built it for Android (see the configuration string at the end of this post).



I can encode an h.263+ video with randomly generated frame content, and ffmpeg otherwise appears to work well.



A similar question suggests that the following code should be sufficient to load an image into an AVFrame:


#include <libavformat/avformat.h>

// Make sure every codec and format is registered.
av_register_all();

AVFormatContext *pFormatCtx = NULL;
int ret = av_open_input_file(&pFormatCtx, imageFileName, NULL, 0, NULL);

if (ret != 0) {
    // Note: strerror() cannot decode FFmpeg's packed error codes
    // (av_strerror() can), hence the "Unknown error" output below.
    printf("Can't open image file '%s': code %d, %s",
           imageFileName, ret, strerror(AVERROR(ret)));
}




The above reports the correct absolute file path and the following error:





Failed '/sdcard/DCIM/Camera/IMG083.jpg' : code -1094995529, Unknown error : 1094995529





Incidentally, if I omit av_register_all(), it returns with error 2.


I've compiled ffmpeg with the following arguments:







./configure --target-os=linux \
  --prefix=$PREFIX \
  --enable-cross-compile \
  --extra-libs="-lgcc" \
  --arch=arm \
  --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
  --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
  --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
  --sysroot=$PLATFORM \
  --extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS " \
  --enable-shared \
  --enable-static \
  --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
  --disable-everything \
  --enable-demuxer=mov \
  --enable-demuxer=h264 \
  --disable-ffplay \
  --enable-protocol=file \
  --enable-avformat \
  --enable-avcodec \
  --enable-decoder=mjpeg \
  --enable-decoder=png \
  --enable-parser=h264 \
  --enable-encoder=h263 \
  --enable-encoder=h263p \
  --disable-network \
  --enable-zlib \
  --disable-avfilter \
  --disable-avdevice







Any suggestions would be most welcome!
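For what it's worth, -1094995529 is AVERROR_INVALIDDATA ("Invalid data found when processing input"), which libavformat returns when none of the enabled demuxers recognises the file. Since the build above uses --disable-everything and only enables the mov and h264 demuxers, a JPEG/image demuxer (for example --enable-demuxer=image2 or --enable-demuxer=mjpeg) would most likely need to be enabled as well, in addition to the mjpeg decoder. For comparison, here is a minimal sketch of the whole JPEG-to-AVFrame path using the newer demuxing/decoding API, assuming FFmpeg 4.x or later (where avcodec_send_packet()/avcodec_receive_frame() exist and av_register_all() is no longer required); the file name is a placeholder and error handling is abbreviated.

// Sketch: decode one JPEG file into an AVFrame with the newer FFmpeg API.
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}
#include <cstdio>

static AVFrame* load_jpeg(const char* imageFileName)
{
    AVFormatContext* fmt = nullptr;
    int ret = avformat_open_input(&fmt, imageFileName, nullptr, nullptr);
    if (ret < 0) {
        char msg[128];
        av_strerror(ret, msg, sizeof(msg));      // decodes FFmpeg error codes
        std::fprintf(stderr, "open '%s' failed: %s\n", imageFileName, msg);
        return nullptr;
    }
    avformat_find_stream_info(fmt, nullptr);

    // A single image file is exposed as one video stream (stream 0).
    const AVCodecParameters* par = fmt->streams[0]->codecpar;
    const AVCodec* dec = avcodec_find_decoder(par->codec_id);   // mjpeg for .jpg
    AVCodecContext* ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, par);
    avcodec_open2(ctx, dec, nullptr);

    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    bool ok = av_read_frame(fmt, pkt) >= 0 &&
              avcodec_send_packet(ctx, pkt) >= 0 &&
              avcodec_receive_frame(ctx, frame) >= 0;

    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
    if (!ok) { av_frame_free(&frame); return nullptr; }
    return frame;                                // caller frees with av_frame_free()
}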


-
Where do I find the Saved Image in Android?
29 October 2016, by FirstStep
Excuse me, quick question:
I have this video-stream routine where I receive packets, convert them to byte[], then to bitmaps, and then display them on the screen:

dsocket.receive(packetReceived); // receive packet
byte[] buff = packetReceived.getData(); // convert packet to byte[]
final Bitmap ReceivedImage = BitmapFactory.decodeByteArray(buff, 0, buff.length); // convert byte[] to bitmap image
runOnUiThread(new Runnable()
{
    @Override
    public void run()
    {
        // this is executed on the main (UI) thread
        imageView.setImageBitmap(ReceivedImage);
    }
});

Now I want to implement a recording feature. Suggestions say I need to use FFmpeg (which I have no idea how to do), but first I need to prepare a directory of the ordered images and then convert it to a video file. To do that I will save all the images internally, and I am using this answer to save each image:
if (RecordVideo && !PauseRecording) {
    saveToInternalStorage(ReceivedImage, ImageNumber);
    ImageNumber++;
}
else
{
    if (!RecordVideo)
        ImageNumber = 0;
}

// ...

private void saveToInternalStorage(Bitmap bitmapImage, int counter) {
    ContextWrapper cw = new ContextWrapper(getApplicationContext());
    // path to /data/data/yourapp/app_data/imageDir
    File MyDirectory = cw.getDir("imageDir", Context.MODE_PRIVATE);
    // Create the image file inside imageDir
    File MyPath = new File(MyDirectory, "Image" + counter + ".jpg");
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(MyPath);
        // Use the compress method on the Bitmap object to write the image to the OutputStream
        // (note: CompressFormat.PNG despite the .jpg extension)
        bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (fos != null)
                fos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    //return MyDirectory.getAbsolutePath();
}

But I can't seem to find the directory on my device (to actually check whether I succeeded in creating it). Where exactly is

// path to /data/data/yourapp/app_data/imageDir

located?
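Two notes that may help. First, Context.getDir("imageDir", MODE_PRIVATE) resolves to the app's private data area, typically /data/data/<your.package>/app_imageDir; that directory is not visible to a normal file browser on an unrooted device, although adb shell run-as <your.package> can reach it for debuggable builds. Second, the conversion step itself is still open, so here is a minimal sketch of it, assuming the numbered frames have been copied off the device and that OpenCV's VideoWriter is an acceptable tool for the job; the file names, codec and frame rate are placeholder assumptions.

#include <opencv2/opencv.hpp>
#include <cstdio>

// Stitch numbered frames (Image0.jpg, Image1.jpg, ...) into a single video file.
int main()
{
    cv::Mat first = cv::imread("Image0.jpg");
    if (first.empty()) return -1;

    // Placeholder output name, codec and frame rate.
    cv::VideoWriter writer("recording.avi",
                           cv::VideoWriter::fourcc('M', 'J', 'P', 'G'),
                           20.0, first.size(), true);

    for (int i = 0; ; ++i) {
        char name[64];
        std::snprintf(name, sizeof(name), "Image%d.jpg", i);
        cv::Mat frame = cv::imread(name);
        if (frame.empty()) break;        // stop at the first missing frame
        writer << frame;
    }
    return 0;                            // the writer is finalized by its destructor
}

cv::imread identifies the format from the file contents rather than the extension, so the PNG-compressed frames saved with a .jpg name above still load correctly.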