
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (100)
-
Automated installation script of MediaSPIP
25 April 2011, by
To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in Bash was created to facilitate this step on a server running a compatible Linux distribution.
You must have SSH access to your server and a root account to use it, since the script installs the dependencies. Contact your hosting provider if you do not have this.
Documentation on using this installation script is available here.
The code of this (...)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Retrieving information from the master site when installing an instance
26 November 2010, by
Purpose
On the main site, a shared (mutualised) instance is defined by several things: its data in the spip_mutus table; its logo; and its main author (id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who will be the only one able to definitively create the shared instance.
It can therefore make good sense to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)
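As an illustration only (this is not part of the article), here is a minimal sketch of how the main author of an instance could be looked up from those tables, assuming a MySQL-style Python DB-API connection to the SPIP database; the column names id_mutu and nom are assumptions and not confirmed by the article:

import mysql.connector  # assumed driver; any DB-API connection works the same way

# Hypothetical sketch: join spip_mutus to spip_auteurs through id_admin to find
# the author allowed to definitively create a shared instance. "id_mutu" and
# "nom" are assumed column names; adjust them to the actual SPIP schema.
def get_instance_main_author(conn, id_mutu):
    query = (
        "SELECT a.id_auteur, a.nom "
        "FROM spip_mutus AS m "
        "JOIN spip_auteurs AS a ON a.id_auteur = m.id_admin "
        "WHERE m.id_mutu = %s"
    )
    cur = conn.cursor()
    cur.execute(query, (id_mutu,))
    return cur.fetchone()  # (id_auteur, nom) or None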
On other sites (10616)
-
How to automate ffmpeg to split and merge parts of video, and keep the audio in sync?
9 December 2024, by Tree
I have a Python script that automates trimming a large video (2 hours) into smaller segments and then concatenating them without re-encoding, to keep the process fast. The script runs these ffmpeg commands:


import subprocess

# Extract chunks
segments = [(0, 300), (300, 600), (600, 900)] # example segments in seconds
for i, (start, length) in enumerate(segments):
    subprocess.run([
        "ffmpeg", "-i", "input.mp4", "-ss", str(start), "-t", str(length),
        "-c", "copy", "-reset_timestamps", "1", "-y", f"chunk_{i}.mp4"
    ], check=True)

# Create concat list
with open("list.txt", "w") as f:
    for i in range(len(segments)):
        f.write(f"file 'chunk_{i}.mp4'\n")

# Concatenate
subprocess.run([
    "ffmpeg", "-f", "concat", "-safe", "0",
    "-i", "list.txt", "-c", "copy", "-y", "merged_output.mp4"
], check=True)



All chunks come from the same source video, with identical codecs, resolution, and bitrate. Despite this, the final merged_output.mp4 sometimes has audio out of sync—especially after the first chunk.


I’ve tried using -ss before -i to cut on keyframes, but the issue persists.


Question: How can I ensure correct A/V sync in the final concatenated video when programmatically segmenting and merging via ffmpeg without fully re-encoding? Is there a way to adjust the ffmpeg commands or process to avoid audio desynchronization?
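One commonly suggested mitigation, offered here only as a hedged sketch and not as a confirmed fix, is to keep -ss before -i for keyframe-aligned seeking but re-encode only the audio track while stream-copying the video, so each chunk starts with clean audio timestamps; the AAC codec choice and the file names below are assumptions:

import subprocess

# Sketch: seek before the input (-ss before -i), stream-copy video, re-encode
# only the audio, and normalise timestamps. Segment times and names are examples.
segments = [(0, 300), (300, 600), (600, 900)]
for i, (start, length) in enumerate(segments):
    subprocess.run([
        "ffmpeg", "-ss", str(start), "-t", str(length), "-i", "input.mp4",
        "-c:v", "copy", "-c:a", "aac", "-avoid_negative_ts", "make_zero",
        "-y", f"chunk_{i}.mp4"
    ], check=True)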


-
How to sync network audio with a different network video and play it with chewie
26 March 2023, by Rudra Sharma
I am trying to stream Reddit videos in my app. For that I am using the Reddit API, but it only gives a video URL like 'redd.it/mpym0z9q8opa1/DASH_1080.mp4?source=fallback' with no audio. After some research I found that an audio URL can be obtained by editing the video URL to 'redd.it/mpym0z9q8opa1/DASH_audio.mp4?source=fallback'.


Now that I have both the audio and video URLs, how can I sync them over the network and stream them in my app using the chewie package (video player)?


This is my code so far:


import 'dart:async';
import 'dart:convert';
import 'package:http/http.dart' as http;
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';
import 'package:chewie/chewie.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
 @override
 Widget build(BuildContext context) {
 return MaterialApp(
 title: 'Reddit Videos',
 theme: ThemeData(
 primarySwatch: Colors.blue,
 visualDensity: VisualDensity.adaptivePlatformDensity,
 ),
 home: VideoPlayerScreen(),
 );
 }
}

class VideoPlayerScreen extends StatefulWidget {
 @override
 _VideoPlayerScreenState createState() => _VideoPlayerScreenState();
}

class _VideoPlayerScreenState extends State<VideoPlayerScreen> {
 final List<String> _videoUrls = [];

 @override
 void initState() {
 super.initState();
 _loadVideos();
 }

 Future<void> _loadVideos() async {
 try {
 final videoUrls =
 await RedditApi.getVideoUrlsFromSubreddit('aww');

 setState(() {
 _videoUrls.addAll(videoUrls);
 });
 } catch (e) {
 print(e);
 }
 }

 @override
 Widget build(BuildContext context) {
 return Scaffold(
 appBar: AppBar(
 title: Text('Reddit Videos'),
 ),
 body: SafeArea(
 child: _videoUrls.isNotEmpty
 ? _buildVideosList()
 : Center(child: CircularProgressIndicator()),
 ),
 );
 }

 Widget _buildVideosList() {
 return ListView.builder(
 itemCount: _videoUrls.length,
 itemBuilder: (context, index) {
 return Padding(
 padding: const EdgeInsets.all(8.0),
 child: Chewie(
 controller: ChewieController(
 videoPlayerController: VideoPlayerController.network(
 _videoUrls[index],
 ),
 aspectRatio: 9 / 16,
 autoPlay: true,
 looping: true,
 autoInitialize: true,
 ),
 ),
 );
 },
 );
 }
}

class RedditApi {
 static const String BASE_URL = 'https://www.reddit.com';
 static const String CLIENT_ID = 'id';
 static const String CLIENT_SECRET = 'secret';

 static Future<List<String>> getVideoUrlsFromSubreddit(
 String subredditName) async {
 final response = await http.get(
 Uri.parse('$BASE_URL/r/$subredditName/top.json?limit=10'),
 headers: {'Authorization': 'Client-ID $CLIENT_ID'});

 if (response.statusCode == 200) {
 final jsonData = jsonDecode(response.body);
 final postsData = jsonData['data']['children'];

 final videoUrls = <String>[];

 for (var postData in postsData) {
 if (postData['data']['is_video']) {
 videoUrls.add(postData['data']['media']['reddit_video']
 ['fallback_url']);
 }
 }

 return videoUrls;
 } else {
 throw Exception("Failed to load videos from subreddit");
 }
 }
}


I think the code is self-explanatory about what I am trying to achieve (I am trying to make a client for Reddit).
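For illustration only (this is not the asker's approach, and it sits outside the Flutter/chewie code above): if the two DASH URLs are fetched outside the app, ffmpeg can mux the separate video and audio streams into a single file that an ordinary player can then stream; the host prefix and the output file name below are assumptions:

import subprocess

# Hypothetical sketch: combine Reddit's separate DASH video and audio streams
# by stream copy. The URLs follow the pattern from the question; the exact host
# and the output file name are assumptions.
video_url = "https://v.redd.it/mpym0z9q8opa1/DASH_1080.mp4?source=fallback"
audio_url = "https://v.redd.it/mpym0z9q8opa1/DASH_audio.mp4?source=fallback"
subprocess.run([
    "ffmpeg", "-i", video_url, "-i", audio_url,
    "-map", "0:v:0", "-map", "1:a:0", "-c", "copy",
    "-y", "combined.mp4"
], check=True)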


-
FFMPEG and JNI - pass AVFrame data to Java and Back
17 October 2015, by tishu
I have some C code that decodes a video frame by frame. I get to a point where I have an AVFrame in BGR32 and would like to send it back to Java for editing.
I have a ByteBuffer object in my C code that was created in Java using allocateDirect, but I struggle to write the content of AVFrame->data[0] (of uint8_t type) to it and read it back. I have tried memcpy with no luck. Does anyone have an idea how to achieve this?
UPDATE
Followed Will’s comment below and wrote this in C:
char *buf = (*pEnv)->GetDirectBufferAddress(pEnv, byteBuffer);
memcpy(buf, rgb_frame->data[0], output_width*output_height*4);
The buffer does contain some data in Java, but doing the following returns a null bitmap:
BufferedImage frame = ImageIO.read(bitmapStream);
Where bitmapStream is a ByteBufferInputStream defined here:
https://code.google.com/p/kryo/source/browse/trunk/src/com/esotericsoftware/kryo/io/ByteBufferInputStream.java?r=205
Not sure whether I am writing things correctly into this buffer.
UPDATE 2
Got pretty close now thanks to the latest snippet. I am using the BGR32 format in my C code, i.e. 4 bytes per pixel. So I modified things a bit in Java:
final byte[] dataArray = new byte[width*height*4];
imageData.get(dataArray);
final BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_BGR);
final DataBuffer buffer = new DataBufferByte(dataArray, dataArray.length);
Raster raster = Raster.createRaster(sampleModel, buffer, null);
image.setData(raster);
I get the image correctly, but there seems to be an issue with the color channels.
Tried different formats with no luck