
Other articles (57)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable release of MediaSPIP.
Its official release date is 21 June 2013 and it is announced here.
The zip file provided here contains only the MediaSPIP sources in standalone version.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you wish to use this archive for an installation in farm mode, you will also need to make other modifications (...)
-
Making the files available
14 April 2011
By default, when it is initialised, MediaSPIP does not allow visitors to download the files, whether they are the originals or the result of their transformation or encoding. It only allows them to be viewed.
However, it is possible and easy to allow visitors to access these documents, in various forms.
All of this happens on the skeleton's configuration page. You need to go to the channel's administration area and choose, in the navigation, (...)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources in standalone version.
To obtain a working installation, all of the software dependencies must be installed manually on the server.
If you wish to use this archive for an installation in farm mode, you will also need to make other modifications (...)
On other sites (5761)
-
configure : Don’t do enable_deep_weak on disabled variables
3 April 2013, by Martin Storsjö
configure: Don't do enable_deep_weak on disabled variables
This avoids cases where configure tries to weakly enable an item
which actually is disabled, ending up still enabling dependencies
of the item which itself is only enabled weakly.
More concretely, the h264 decoder suggests error resilience, which
is then enabled weakly (unless manually disabled). Previously,
dsputil, which is a dependency of error resilience, was enabled
even if error resilience wasn't enabled in the end.
Signed-off-by: Martin Storsjö <martin@martin.st>
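As a rough illustration of the dependency rule this commit describes (a toy C++ sketch, not FFmpeg's actual configure shell code; all names and the structure are made up for illustration), weakly enabling an item should be a no-op when that item has been explicitly disabled, so its own dependencies are never pulled in:

// Toy model of the weak-enable rule: a suggested item that was
// explicitly disabled must not drag in its own dependencies.
#include <iostream>
#include <map>
#include <string>
#include <vector>

enum class State { Unset, Enabled, Disabled };

int main() {
    std::map<std::string, State> state{
        {"error_resilience", State::Disabled},  // manually disabled
        {"dsputil", State::Unset}};
    std::map<std::string, std::vector<std::string>> deps{
        {"error_resilience", {"dsputil"}}};

    auto enable_weak = [&](const std::string& item) {
        if (state[item] == State::Disabled)
            return;  // the fix: stop before touching dependencies
        state[item] = State::Enabled;
        for (const auto& d : deps[item])
            state[d] = State::Enabled;
    };

    enable_weak("error_resilience");  // the h264 decoder suggests this
    std::cout << "dsputil enabled: "
              << (state["dsputil"] == State::Enabled ? "yes" : "no") << "\n";
}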
-
Android FFMPEG command line for video filter
4 February 2014, by user2568369
I am using the following command to add a retro effect to a video.
String[] ffmpegCommand = {
    "/data/data/com.mobvcasting.mjpegffmpeg/ffmpeg",
    "-r", "" + p.getPreviewFrameRate(),
    "-b", "1000000",
    "-vcodec", "mjpeg",
    "-i", Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg",
    "-vcodec", "mjpeg",
    "-acodec", "libfaac",
    "-vf", "curves=vintage",
    "-qscale", "3",
    "-async", "1",
    "-y", Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/video.mp4"
};
ffmpegProcess = new ProcessBuilder(ffmpegCommand).redirectErrorStream(true).start();
BufferedReader reader = new BufferedReader(new InputStreamReader(ffmpegProcess.getInputStream()));
but I am getting the following error:
09-02 13:42:57.343: V/MJPEG_FFMPEG(2346): Finished Writing Frame
09-02 13:42:57.351: V/MJPEG_FFMPEG(2346): Recording Stopped
09-02 13:42:57.414: V/MJPEG_FFMPEG(2346): ***Starting FFMPEG***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): ***FFmpeg version UNKNOWN, Copyright (c) 2000-2010 the FFmpeg developers***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** built on Jul 28 2011 16:47:07 with gcc 4.4.3***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** configuration: --target-os=linux --cross-prefix=arm-linux-androideabi- --arch=arm --sysroot=/Developer/android-ndk-r5b//platforms/android-3/arch-arm --soname-prefix=/data/data/com.mobvcasting.mjpegffmpeg/ --enable-shared --disable-symver --enable-small --optimization-flags=-O2 --enable-encoder=mpeg2video --enable-encoder=nellymoser --enable-protocol=file --prefix=../build/ffmpeg/armeabi --extra-cflags= --extra-ldflags=***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavutil 50.34. 0 / 50.34. 0***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavcore 0.16. 0 / 0.16. 0***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavcodec 52.99. 1 / 52.99. 1***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavformat 52.88. 0 / 52.88. 0***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavdevice 52. 2. 2 / 52. 2. 2***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libavfilter 1.69. 0 / 1.69. 0***
09-02 13:42:57.460: V/MJPEG_FFMPEG(2346): *** libswscale 0.12. 0 / 0.12. 0***
09-02 13:42:57.648: V/MJPEG_FFMPEG(2346): ***Input #0, image2, from '/mnt/sdcard/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg':***
09-02 13:42:57.656: V/MJPEG_FFMPEG(2346): *** Duration: 00:00:01.83, start: 0.000000, bitrate: N/A***
09-02 13:42:57.656: V/MJPEG_FFMPEG(2346): *** Stream #0.0: Video: mjpeg, yuvj420p, 320x240 [PAR 1:1 DAR 4:3], 30 fps, 30 tbr, 30 tbn, 30 tbc***
09-02 13:42:57.695: V/MJPEG_FFMPEG(2346): ***[buffer @ 0x5d110] w:320 h:240 pixfmt:yuvj420p***
09-02 13:42:57.695: V/MJPEG_FFMPEG(2346): ***No such filter: 'curves'***
09-02 13:42:57.695: V/MJPEG_FFMPEG(2346): ***Error opening filters!***
09-02 13:42:57.695: V/MJPEG_FFMPEG(2346): ***Ending FFMPEG***
Any help will be highly appreciated.
-
Updating SDL YUV Texture
15 June 2015, by madprogrammer2015
I am receiving an H.264 video stream and successfully decoding it with FFmpeg. It can display the first frame of data, but after that the screen never updates; it just appears to become a static image. I am using the YUV pixel format, and I am receiving the stream in that format as well. I am also using SDL_UpdateYUVTexture().
Here is my code:
int main()
{
WORD wVersionRequested;
WSADATA wsaData;
int wsaerr;
if (SDL_Init(SDL_INIT_EVERYTHING)) {
fprintf(stderr, "Could not initialize SDL - %s\n", SDL_GetError());
exit(1);
}
// Using MAKEWORD macro, Winsock version request 2.2
wVersionRequested = MAKEWORD(2, 2);
wsaerr = WSAStartup(wVersionRequested, &wsaData);
if (wsaerr != 0)
{
/* Tell the user that we could not find a usable */
/* WinSock DLL.*/
printf("The Winsock dll not found!\n");
return 0;
}
else
{
printf("The Winsock dll found!\n");
printf("The status: %s.\n", wsaData.szSystemStatus);
}
/* Confirm that the WinSock DLL supports 2.2.*/
/* Note that if the DLL supports versions greater */
/* than 2.2 in addition to 2.2, it will still return */
/* 2.2 in wVersion since that is the version we */
/* requested. */
if (LOBYTE(wsaData.wVersion) != 2 || HIBYTE(wsaData.wVersion) != 2)
{
/* Tell the user that we could not find a usable */
/* WinSock DLL.*/
printf("The dll do not support the Winsock version %u.%u!\n", LOBYTE(wsaData.wVersion), HIBYTE(wsaData.wVersion));
WSACleanup();
return 0;
}
else
{
printf("The dll supports the Winsock version %u.%u!\n", LOBYTE(wsaData.wVersion), HIBYTE(wsaData.wVersion));
printf("The highest version this dll can support: %u.%u\n", LOBYTE(wsaData.wHighVersion), HIBYTE(wsaData.wHighVersion));
}
ULONG localif;
/*INT Ret;
HANDLE ThreadHandle;
DWORD ThreadId;
WSAEVENT AcceptEvent;
char buf[1024];
int buflen = 1024, rc, err;*/
SOCKET s;
SOCKET ns;
SOCKADDR_IN multi, safrom;
int fromlen;
int totalSize = 0;
AVCodec *codec;
AVCodecContext *codecContext;
int frame;
int got_picture;
AVFrame *picture;
AVPacket packet;
SwsContext* convertContext;
uint16_t i = 1;
//std::queue<MadProto> queue;
//std::list<MadProto> list;
AVCodecParserContext *parser;
std::vector<uint8_t> buffer;
//moodycamel::ConcurrentQueue<MadProto> protoQueue;
SDL_Window *window;
SDL_Renderer *renderer;
SDL_Texture *bmp;
SDL_Rect rect;
file.open("log.txt");
s = socket(AF_INET, SOCK_STREAM, IPPROTO_RM);
multi.sin_family = AF_INET;
multi.sin_port = htons(5150);
multi.sin_addr.s_addr = inet_addr("234.5.6.7");
int bindResult = bind(s, (PSOCKADDR)&multi, sizeof(multi));
if (bindResult < 0)
{
std::cout << "bindResult: " << WSAGetLastError() << std::endl;
}
listen(s, 10);
//if ((AcceptEvent = WSACreateEvent()) == WSA_INVALID_EVENT)
//{
// printf("WSACreateEvent() failed with error %d\n", WSAGetLastError());
// return 1;
//}
//else
// printf("WSACreateEvent() is OK!\n");
//// Create a worker thread to service completed I/O requests
//if ((ThreadHandle = CreateThread(NULL, 0, WorkerThread, (LPVOID)AcceptEvent, 0, &ThreadId)) == NULL)
//{
// printf("CreateThread() failed with error %d\n", GetLastError());
// return 1;
//}
//else
// printf("CreateThread() should be fine!\n");
localif = inet_addr("192.168.1.2");
setsockopt(s, IPPROTO_RM, RM_ADD_RECEIVE_IF, (char *)&localif, sizeof(localif));
fromlen = sizeof(safrom);
ns = accept(s, (SOCKADDR *)&safrom, &fromlen);
closesocket(s); // Don't need to listen anymore
std::string received;
av_register_all();
int horizontal = 0;
int vertical = 0;
GetDesktopResolution(horizontal, vertical);
codec = avcodec_find_decoder(CODEC_ID_H264);
if (!codec) {
std::cout << "codec not found" << std::endl;
std::cin.get();
}
codecContext = avcodec_alloc_context3(codec);
/*if (codec->capabilities & CODEC_CAP_TRUNCATED)
codecContext->flags |= CODEC_FLAG_TRUNCATED;*/
//codecContext->flags |= CODEC_FLAG_LOW_DELAY;
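// CODEC_FLAG2_CHUNKS: input packets may be cut at arbitrary boundaries
// rather than each containing exactly one complete frame.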
codecContext->flags2 |= CODEC_FLAG2_CHUNKS;
codecContext->width = horizontal;
codecContext->height = vertical;
codecContext->codec_id = CODEC_ID_H264;
codecContext->codec_type = AVMEDIA_TYPE_VIDEO;
codecContext->pix_fmt = PIX_FMT_YUV420P;
codecContext->thread_type = 0;
if (avcodec_open2(codecContext, codec, NULL) < 0) {
std::cout << "could not open codec" << std::endl;
std::cin.get();
}
convertContext = sws_getContext(
codecContext->width,
codecContext->height,
PIX_FMT_RGB32,
codecContext->width,
codecContext->height,
PIX_FMT_YUV420P,
SWS_BICUBIC,
NULL,
NULL,
NULL
);
parser = av_parser_init(CODEC_ID_H264);
picture = av_frame_alloc();
if (ns == INVALID_SOCKET)
{
std::cout << "accept didn't work!" << std::endl;
std::cin.get();
}
/*if (WSASetEvent(AcceptEvent) == FALSE)
{
printf("WSASetEvent() failed with error %d\n", WSAGetLastError());
return 1;
}
else
printf("WSASetEvent() should be working!\n");*/
window = SDL_CreateWindow("YUV", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, codecContext->width, codecContext->height, SDL_WINDOW_SHOWN);
renderer = SDL_CreateRenderer(window, -1, 0);
bmp = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_IYUV, SDL_TEXTUREACCESS_STREAMING, codecContext->width, codecContext->height);
//receive = SDL_CreateThread(receiveThread, "ReceiveThread", (void *)NULL);
bool quit = false;
rect.x = 0;
rect.y = 0;
rect.w = codecContext->width;
rect.h = codecContext->height;
while (!quit)
{
while (true)
{
MadProto proto;
int result = recvfrom(ns, (char *)&proto, sizeof(MadProto), 0, (struct sockaddr *)&multi, &fromlen);
if (result < 0)
{
std::cout << "receive failed! error: " << WSAGetLastError() << std::endl;
break;
}
else
{
std::cout << "receive successful, received " << result << " bytes" << std::endl;
if (ntohs(proto.frame_end) == 1)
{
uint8_t *outbuffer = NULL;
int outBufSize = 0;
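// av_parser_parse2() scans the accumulated bytes and, once it has a
// complete frame, returns it via outbuffer/outBufSize; until then
// outBufSize stays 0 and more input is needed.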
int rc = av_parser_parse2(parser, codecContext, &outbuffer, &outBufSize, buffer.data(), buffer.size(), 0, 0, 0);
if (outBufSize <= 0)
{
std::cout << "parsing failed!" << std::endl;
std::cout << "outBufSize: " << outBufSize << std::endl;
break;
}
if (rc)
{
std::cout << "rc: " << rc << std::endl;
std::cout << "parsing successful!" << std::endl;
//std::cin.get();
av_init_packet(&packet);
packet.size = outBufSize;
packet.data = outbuffer;
frame = avcodec_decode_video2(codecContext, picture, &got_picture, &packet);
if (frame < 0)
{
std::cout << "decoding was unsuccessful!" << std::endl;
break;
}
if (got_picture)
{
std::cout << "decoding was successful!" << std::endl;
std::cout << "decoded length was: " << frame << std::endl;
buffer.empty();
//std::cin.get();
int code = SDL_UpdateYUVTexture(bmp, NULL, picture->data[0], picture->linesize[0],
picture->data[1], picture->linesize[1],
picture->data[2], picture->linesize[2]);
if (code < 0)
{
std::cout << "unable to update texture " << SDL_GetError() << std::endl;
std::cin.get();
}
code = SDL_RenderClear(renderer);
if (code < 0)
{
std::cout << "renderer clear failed " << SDL_GetError() << std::endl;
std::cin.get();
}
code = SDL_RenderCopy(renderer, bmp, NULL, &rect);
if (code < 0)
{
std::cout << "renderer copy failed " << SDL_GetError() << std::endl;
std::cin.get();
}
SDL_RenderPresent(renderer);
SDL_Delay(40);
}
av_free_packet(&packet);
}
}
else
{
std::copy(proto.payload, proto.payload + ntohs(proto.nal_length), std::back_inserter(buffer));
std::cout << "frame is continuing!" << std::endl;
//queue.push(proto);
//list.push_front(proto);
}
}
}
SDL_WaitEvent(&event);
switch (event.type)
{
case SDL_QUIT:
quit = true;
break;
}
}
std::cout << "closing everything!" << std::endl;
av_frame_free(&picture);
closesocket(ns);
fclose(f);
std::cin.get();
return 0;
}
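For reference, here is a minimal sketch of the per-frame presentation path the question relies on (illustrative only; the helper name presentFrame and the standalone structure are not taken from the code above): each decoded AVFrame is uploaded plane by plane and re-presented, and SDL events are polled between frames rather than waited on once at the end, so the window keeps refreshing while data keeps arriving.

extern "C" {
#include <libavutil/frame.h>   // AVFrame
}
#include <SDL.h>

// Upload the three YUV planes of one decoded frame and present it.
static void presentFrame(SDL_Renderer *renderer, SDL_Texture *tex,
                         const AVFrame *pic, const SDL_Rect *dst)
{
    SDL_UpdateYUVTexture(tex, NULL,
                         pic->data[0], pic->linesize[0],
                         pic->data[1], pic->linesize[1],
                         pic->data[2], pic->linesize[2]);
    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, tex, NULL, dst);
    SDL_RenderPresent(renderer);
}

// In the receive/decode loop, poll events non-blockingly on each
// iteration instead of calling SDL_WaitEvent() only after the loop:
//     SDL_Event ev;
//     while (SDL_PollEvent(&ev))
//         if (ev.type == SDL_QUIT) quit = true;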