Advanced search

Media (17)

Keyword: - Tags -/wired

Other articles (29)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded as Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
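
    For illustration, a conversion step like the one described above can be sketched with ffmpeg called from Python. The codec flags and output names below are assumptions made for this sketch, not MediaSPIP's actual configuration.

import subprocess

def encode_web_formats(src):
    # Illustrative renditions matching the formats described above;
    # the encoder settings are assumptions, not MediaSPIP's real ones.
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libvpx", "-c:a", "libvorbis",
                    "-y", "out.webm"], check=True)   # WebM (HTML5)
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libtheora", "-c:a", "libvorbis",
                    "-y", "out.ogv"], check=True)    # Ogv (HTML5)
    subprocess.run(["ffmpeg", "-i", src, "-c:v", "libx264", "-c:a", "aac",
                    "-y", "out.mp4"], check=True)    # MP4 (Flash)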

  • Support for all media types

    10 April 2011

    Unlike many programs and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

On other sites (8240)

  • How to automate ffmpeg to split and merge parts of a video, and keep the audio in sync?

    9 December 2024, by Tree

    I have a Python script that automates trimming a large video (2 hours) into smaller segments and then concatenating them without re-encoding, to keep the process fast. The script runs these ffmpeg commands:

import subprocess

# Extract chunks
segments = [(0, 300), (300, 600), (600, 900)]  # example segments in seconds
for i, (start, length) in enumerate(segments):
    subprocess.run([
        "ffmpeg", "-i", "input.mp4", "-ss", str(start), "-t", str(length),
        "-c", "copy", "-reset_timestamps", "1", "-y", f"chunk_{i}.mp4"
    ], check=True)

# Create concat list
with open("list.txt", "w") as f:
    for i in range(len(segments)):
        f.write(f"file 'chunk_{i}.mp4'\n")

# Concatenate
subprocess.run([
    "ffmpeg", "-f", "concat", "-safe", "0",
    "-i", "list.txt", "-c", "copy", "-y", "merged_output.mp4"
], check=True)

    All chunks come from the same source video, with identical codecs, resolution, and bitrate. Despite this, the final merged_output.mp4 sometimes has the audio out of sync, especially after the first chunk.

    I’ve tried using -ss before -i to cut on keyframes, but the issue persists.

    Question: How can I ensure correct A/V sync in the final concatenated video when programmatically segmenting and merging via ffmpeg without fully re-encoding? Is there a way to adjust the ffmpeg commands or process to avoid audio desynchronization?
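
    Since stream-copied cuts can only begin on video keyframes while audio packets rarely align with them, one commonly suggested variant (untested here; the AAC settings are assumptions) keeps the video stream copy but re-encodes only the audio and rebases each chunk's timestamps:

import subprocess

# Variant of the extraction step: -ss/-t before -i seeks by keyframe,
# -avoid_negative_ts make_zero rebases timestamps per chunk, and only
# the audio is re-encoded while the video is still stream-copied.
segments = [(0, 300), (300, 600), (600, 900)]
for i, (start, length) in enumerate(segments):
    subprocess.run([
        "ffmpeg", "-ss", str(start), "-t", str(length), "-i", "input.mp4",
        "-c:v", "copy", "-c:a", "aac", "-b:a", "192k",
        "-avoid_negative_ts", "make_zero", "-y", f"chunk_{i}.mp4"
    ], check=True)

    The concat step would stay exactly as above; because every chunk then carries identical, freshly encoded audio, this is often reported to remove the drift at chunk boundaries.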

  • How to sync network audio with a different network video and play it with chewie

    26 March 2023, by Rudra Sharma

    I am trying to stream Reddit videos in my app. For that reason I am using the Reddit API, but it only gives a video URL like 'redd.it/mpym0z9q8opa1/DASH_1080.mp4?source=fallback' with no audio. After some research I found that you can get the audio URL by editing the video URL: 'redd.it/mpym0z9q8opa1/DASH_audio.mp4?source=fallback'.

    Now that I have both the audio and video URLs, how can I sync them over the network and stream them in my app using the chewie package (a video player)? One possible approach is sketched after the code below.

    This is my code so far:

    import 'dart:async';
    import 'dart:convert';
    import 'package:http/http.dart' as http;
    import 'package:flutter/material.dart';
    import 'package:video_player/video_player.dart';
    import 'package:chewie/chewie.dart';

    void main() => runApp(MyApp());

    class MyApp extends StatelessWidget {
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'Reddit Videos',
          theme: ThemeData(
            primarySwatch: Colors.blue,
            visualDensity: VisualDensity.adaptivePlatformDensity,
          ),
          home: VideoPlayerScreen(),
        );
      }
    }

    class VideoPlayerScreen extends StatefulWidget {
      @override
      _VideoPlayerScreenState createState() => _VideoPlayerScreenState();
    }

    class _VideoPlayerScreenState extends State<VideoPlayerScreen> {
      final List<String> _videoUrls = [];

      @override
      void initState() {
        super.initState();
        _loadVideos();
      }

      Future<void> _loadVideos() async {
        try {
          final videoUrls =
              await RedditApi.getVideoUrlsFromSubreddit('aww');

          setState(() {
            _videoUrls.addAll(videoUrls);
          });
        } catch (e) {
          print(e);
        }
      }

      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text('Reddit Videos'),
          ),
          body: SafeArea(
            child: _videoUrls.isNotEmpty
                ? _buildVideosList()
                : Center(child: CircularProgressIndicator()),
          ),
        );
      }

      Widget _buildVideosList() {
        return ListView.builder(
          itemCount: _videoUrls.length,
          itemBuilder: (context, index) {
            return Padding(
              padding: const EdgeInsets.all(8.0),
              child: Chewie(
                controller: ChewieController(
                  videoPlayerController: VideoPlayerController.network(
                    _videoUrls[index],
                  ),
                  aspectRatio: 9 / 16,
                  autoPlay: true,
                  looping: true,
                  autoInitialize: true,
                ),
              ),
            );
          },
        );
      }
    }

    class RedditApi {
      static const String BASE_URL = 'https://www.reddit.com';
      static const String CLIENT_ID = 'id';
      static const String CLIENT_SECRET = 'secret';

      static Future<List<String>> getVideoUrlsFromSubreddit(
          String subredditName) async {
        final response = await http.get(
            Uri.parse('$BASE_URL/r/$subredditName/top.json?limit=10'),
            headers: {'Authorization': 'Client-ID $CLIENT_ID'});

        if (response.statusCode == 200) {
          final jsonData = jsonDecode(response.body);
          final postsData = jsonData['data']['children'];

          final videoUrls = <String>[];

          for (var postData in postsData) {
            if (postData['data']['is_video']) {
              videoUrls.add(postData['data']['media']['reddit_video']
                  ['fallback_url']);
            }
          }

          return videoUrls;
        } else {
          throw Exception("Failed to load videos from subreddit");
        }
      }
    }

    I think the code is self-explanatory about what I am trying to achieve (I am trying to build a Reddit client).
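
    For comparison, one approach often suggested for the DASH video/audio split is to mux the two streams into a single file with ffmpeg and hand the player one URL, rather than syncing two players client-side. A minimal Python sketch, assuming ffmpeg is installed and reusing the URL pattern from the question (placeholders, not verified):

import subprocess

video_url = "https://v.redd.it/mpym0z9q8opa1/DASH_1080.mp4?source=fallback"
# The matching audio track appears to be the same path with the
# rendition name swapped out (assumption based on the question).
audio_url = video_url.replace("DASH_1080", "DASH_audio")

# Stream-copy the video from input 0 and the audio from input 1 into
# one MP4; no re-encoding is involved, so this is fast.
subprocess.run([
    "ffmpeg", "-i", video_url, "-i", audio_url,
    "-c", "copy", "-map", "0:v:0", "-map", "1:a:0",
    "-y", "merged.mp4"
], check=True)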

  • FFMPEG and JNI - pass AVFrame data to Java and back

    17 October 2015, by tishu

    I have some C code that decodes a video frame by frame. I get to a point where I have an AVFrame in BGR32 and would like to send it back to Java for editing.

    I have a ByteBuffer object in my C code that was created in Java using allocateDirect, but I struggle to write the content of AVFrame->data[0] (of type uint8_t) to it and read it back. I have tried memcpy with no luck. Does anyone have an idea how to achieve this?

    UPDATE
    Followed Will’s comment below and wrote this in C:

    /* Get the address of the Java-allocated direct ByteBuffer and copy
       the BGR32 frame into it (4 bytes per pixel). */
    char *buf = (*pEnv)->GetDirectBufferAddress(pEnv, byteBuffer);
    memcpy(buf, rgb_frame->data[0], output_width * output_height * 4);

    The buffer does contain some data in Java, but the following returns a null bitmap:

    BufferedImage frame = ImageIO.read(bitmapStream);

    Where bitmapStream is a ByteBufferInputStream defined here:
    https://code.google.com/p/kryo/source/browse/trunk/src/com/esotericsoftware/kryo/io/ByteBufferInputStream.java?r=205

    I am not sure whether I am writing things into this buffer correctly.

    UPDATE 2

    Got pretty close now thanks to the latest snippet. I am using the BGR32 format in my C code, i.e. 4 bytes per pixel. So I modified things a bit in Java:

    // imageData is the direct ByteBuffer filled on the C side (BGR32, 4 bytes per pixel).
    final byte[] dataArray = new byte[width * height * 4];
    imageData.get(dataArray);
    final BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_BGR);
    final DataBuffer buffer = new DataBufferByte(dataArray, dataArray.length);
    // sampleModel is defined elsewhere to describe the 4-bytes-per-pixel layout.
    Raster raster = Raster.createRaster(sampleModel, buffer, null);
    image.setData(raster);

    I get the image correctly but there seems to be an issue with the color channels (example image omitted).

    I have tried different formats, with no luck.