
Advanced search
Media (91)
-
Spitfire Parade - Crisis
15 May 2011
Updated: September 2011
Language: English
Type: Audio
-
Wired NextMusic
14 May 2011
Updated: February 2012
Language: English
Type: Video
-
Video d'abeille en portrait
14 May 2011
Updated: February 2012
Language: French
Type: Video
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
-
Carte de Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
-
Publier une image simplement
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (110)
-
Personnaliser en ajoutant son logo, sa bannière ou son image de fond
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Ecrire une actualité
21 June 2013. Present the changes to your MediaSPIP, or news about your projects, using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of the news type, the default fields are: publication date (customize the publication date) (...)
-
Support de tous types de médias
10 April 2011. Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
On other sites (10683)
-
Android merging PCM and MP3 into AAC
4 February 2019, by kataroty
I have a program that records user input sound as PCM (I needed to do it separately to "play" with the voice), and I also have a custom audio track in MP3 format that I want to merge with the PCM file.
To start off I convert both of them to WAV separately, then I combine the two WAV files, and finally I convert the result to AAC, because I also need to merge the audio with video later.
I tried merging two AAC files directly, but that did not work out for me.
For audio conversion I am using FFmpeg-Android.
The problem is that it takes way too long, around 1-2 minutes for the whole conversion, so I need a new way to do it all. I have looked into other libraries, but this was the only one I could get to work.
Can someone recommend something that would do the whole process faster?
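For reference, the steps described above can in principle be collapsed into a single ffmpeg run: decode the raw PCM and the MP3, lower the background volume, mix, and encode to AAC in one filter graph. The following is a rough, untested sketch against the same execute()/ExecuteBinaryResponseHandler API that appears in the code below; the file fields and the 0.9 background volume come from that code, everything else is an assumption, not a verified fix.

// Sketch only: one ffmpeg pass instead of several, so no intermediate WAV files.
String[] onePassCmd = {
        "-f", "s16le", "-ar", "44.1k", "-i", micPcmFile.toString(),
        "-i", backgroundMp3File.toString(),
        "-filter_complex",
        "[1:a]volume=0.9[bg];[0:a][bg]amix=inputs=2:duration=first:dropout_transition=3[mix]",
        "-map", "[mix]",
        "-c:a", "aac", "-b:a", "128k", "-f", "adts",
        "-y", outputFile.toString()
};
currentTask = ffmpeg.execute(onePassCmd, new ExecuteBinaryResponseHandler() {
    @Override
    public void onSuccess(String message) {
        // outputFile now holds the mixed AAC stream.
    }
    @Override
    public void onFailure(String message) {
        // Report the error instead of retrying in a loop.
    }
});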
Here is my code to merge all of the files:
public class AudioProcessor {
private Context context;
private FFmpeg ffmpeg;
private AudioProcessorListener listener;
private File micPcmFile;
private File backgroundMp3File;
private File pcmtowavTempFile;
private File mp3towavTempFile;
private File combinedwavTempFile;
private File outputFile;
private File volumeChangedTempFile;
private FFtask currentTask;
private int videoRecordingLength = 0;
TextView extensionDownload, percentProgress;
private static final String TAG = "FFMPEG AV Processor";
public AudioProcessor(Context context, Activity activity) {
ffmpeg = null;
ffmpeg = FFmpeg.getInstance(context);
percentProgress = activity.findViewById(R.id.percentProgress);
percentProgress.setSingleLine(false);
this.context = context;
prepare();
}
/**
* Program main method. Starts running program
* @throws Exception
*/
public void process() throws Exception {
if (!ffmpeg.isSupported()) {
Log.e(TAG, "FFMPEG not supported! Cannot convert audio!");
throw new RuntimeException("FFMPeg has to be supported");
}
if (!checkIfAllFilesPresent()) {
Log.e(TAG, "All files are not set yet. Please set file first");
throw new RuntimeException("Files are not set!");
}
Log.e(TAG, "Start processing audio!");
listener.onStart();
Handler handler = new Handler();
handler.postDelayed(new Runnable() {
@Override
public void run() {
convertPCMToWav();
}
}, 200);
}
/**
* Prepares program
*/
private void prepare() {
Log.d(TAG, "Preparing everything...");
prepareTempFiles();
}
/**
* Converts PCM to wav file. Automatically create new file.
*/
private void convertPCMToWav() {
Log.d(TAG, "Convert PCM TO Wav");
//ffmpeg -f s16le -ar 44.1k -ac 2 -i file.pcm file.wav
String[] cmd = { "-f" , "s16le", "-ar", "44.1k", "-i", micPcmFile.toString(), "-y", pcmtowavTempFile.toString()};
currentTask = ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setVisibility(View.VISIBLE);
percentProgress.setText("Converting your recording\n"+"1/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
convertMP3ToWav();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
convertPCMToWav();
}
});
}
/**
* Converts mp3 file to wav file.
* Automatically creates Wav file
*/
private void convertMP3ToWav() {
Log.e(TAG, "Convert MP3 TO Wav");
//ffmpeg -ss 0 -t 30 -i file.mp3 file.wav
//String[] cmd = { "-ss", "0", "-t", Integer.toString(videoRecordingLength), "-i" , backgroundMp3File.toString(), "-y", mp3towavTempFile.toString() };
String[] cmd = { "-i" , backgroundMp3File.toString(), "-y", mp3towavTempFile.toString() };
currentTask = ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Converting background audio\n"+"2/5");
Log.d(TAG, "Convert MP3 TO Wav");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
changeMicAudio();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
Log.e(TAG, "Failed to convert MP3 TO Wav");
onError(message);
throw new RuntimeException("Failed to convert MP3 TO Wav");
}
});
}
/**
* Combines 2 wav files into one wav file. Overlays audio
*/
private void combineWavs() {
Log.e(TAG, "Combine wavs");
//ffmpeg -i C:\Users\VR1\Desktop\_mp3.wav -i C:\Users\VR1\Desktop\_pcm.wav -filter_complex amix=inputs=2:duration=first:dropout_transition=3 C:\Users\VR1\Desktop\out.wav
String[] cmd = { "-i" , pcmtowavTempFile.toString(), "-i", volumeChangedTempFile.toString(), "-filter_complex", "amix=inputs=2:duration=first:dropout_transition=3", "-y",combinedwavTempFile.toString()};
currentTask = ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Combining the two audio files\n"+"4/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
encodeWavToAAC();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
}
});
}
private void changeMicAudio(){
Log.e(TAG, "Change audio volume");
//ffmpeg -i input.wav -filter:a "volume=1.5" output.wav
String[] cmdy = { "-i", mp3towavTempFile.toString(), "-af", "volume=0.9", "-y",volumeChangedTempFile.toString()};
currentTask = ffmpeg.execute(cmdy, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Normalizing volume\n"+"3/5");
}
@Override
public void onSuccess(String message) {
combineWavs();
super.onSuccess(message);
}
@Override
public void onFailure(String message) {
super.onFailure(message);
Log.e("AudioProcessor", message);
}
});
}
/**
* Do something on error. Releases program data (deletes files)
* @param message
*/
private void onError(String message) {
completed();
if (listener != null) {
//listener.onError(message);
}
}
/**
* Encode to AAC
*/
private void encodeWavToAAC() {
Log.d(TAG, "Encode Wav file to AAC");
//ffmpeg -i file.wav -c:a aac -b:a 128k -f adts output.m4a
String[] cmd = { "-i" , combinedwavTempFile.toString(), "-c:a", "aac", "-b:a", "128k", "-f", "adts", "-y",outputFile.toString()};
currentTask = ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Normalizing volume\n"+"3/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
if (listener != null) {
listener.onSuccess(outputFile);
}
completed();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
encodeWavToAAC();
}
});
}
/**
* Uninitializes class
*/
private void completed() {
if (listener != null) {
listener.onFinish();
}
Log.d(TAG, "Process completed successfully!");
destroyTempFiles();
}
/**
* Prepares the required temp files by deleting them if they exist.
* The files cannot exist before the ffmpeg actions run; FFmpeg creates them automatically.
*/
private void prepareTempFiles() {
Log.d(TAG, "Preparing Temp files...");
pcmtowavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_pcm.wav");
mp3towavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_mp3.wav");
combinedwavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_combined.wav");
volumeChangedTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_volumeChanged.wav");
}
/**
* Destroys temp required files
*/
private void destroyTempFiles() {
Log.d(TAG, "Destroying Temp files...");
pcmtowavTempFile.delete();
mp3towavTempFile.delete();
combinedwavTempFile.delete();
volumeChangedTempFile.delete();
Log.d(TAG, "Destroying files completed!");
}
/**
* Checks if all files are set, so we can process them
* @return - all files ready
*/
private boolean checkIfAllFilesPresent() {
if(micPcmFile == null || backgroundMp3File == null || outputFile == null) {
Log.e(TAG, "All files are not set! Set all files!");
throw new RuntimeException("Output file is not present!");
}
Log.d(TAG, "All files are present!");
return true;
}
public void setOutputFile(File outputFile) {
this.outputFile = outputFile;
}
public void setListener(AudioProcessorListener listener) {
this.listener = listener;
}
public void setMicPcmFile(File micPcmFile) {
this.micPcmFile = micPcmFile;
}
public void setBackgroundMp3File(File backgroundMp3File) {
this.backgroundMp3File = backgroundMp3File;
}
public void setVideoRecordingLength(int seconds) {
this.videoRecordingLength = seconds;
}
/**
* Quits current processing ffmpeg task
*/
public void killCurrentTask() {
if (currentTask != null) {
currentTask.killRunningProcess();
}
}
public interface AudioProcessorListener {
void onStart();
void onSuccess(File output);
void onError(String message);
void onFinish();
}
}
-
FFmpeg Processes keep running even after finishAndRemoveTask()
9 October 2018, by kataroty
I have a program that gets audio files and converts them using Bravobit FFmpeg (Bravobit FFmpeg on GitHub). It works as intended until someone minimizes the screen during the conversion process. I am trying to fix that, but so far I have not been successful.
This part is in the main Activity that creates and starts my AudioProcessor:
AudioProcessor ac = new AudioProcessor(getApplicationContext(), PostNewActivity.this);
ac.setBackgroundMp3File(backgroundAudio);
ac.setMicPcmFile(micOutputPCM);
ac.setOutputFile(tempOutFile);
ac.setListener(new AudioProcessor.AudioProcessorListener() {
public void onStart() {
Log.d("Audioprocessor", "Audioprocessor is successful");
}
public void onSuccess(File output) {
Log.d("Audioprocessor", "Audioprocessor is successful");
goToPublishView();
}
public void onError(String message) {
System.out.println("Audioprocessor: " + message);
}
public void onFinish() {
Log.d("Audioprocessor", "Audioprocessor is finshed");
}
});
try {
if (tempOutFile.exists()) {
tempOutFile.delete();
}
ac.process();
} catch (Exception ex) {
System.out.println("Processing failed!");
ex.printStackTrace();
}
And here is the AudioProcessor itself:
public class AudioProcessor {
private Context context;
private FFmpeg ffmpeg;
private AudioProcessorListener listener;
private File micPcmFile;
private File backgroundMp3File;
private File pcmtowavTempFile;
private File mp3towavTempFile;
private File combinedwavTempFile;
private File outputFile;
private File volumeChangedTempFile;
TextView extensionDownload, percentProgress;
public AudioProcessor(Context context, Activity activity) {
ffmpeg = null;
ffmpeg = FFmpeg.getInstance(context);
percentProgress = activity.findViewById(R.id.percentProgress);
percentProgress.setSingleLine(false);
this.context = context;
prepare();
}
/**
* Program main method. Starts running program
* @throws Exception
*/
public void process() throws Exception {
if (!ffmpeg.isSupported()) {
Log.e("AudioProcessor", "FFMPEG not supported! Cannot convert audio!");
throw new Exception("FFMPeg has to be supported");
}
if (!checkIfAllFilesPresent()) {
Log.e("AudioProcessor", "All files are not set yet. Please set file first");
throw new Exception("Files are not set!");
}
Log.e("AudioProcessor", "Start processing audio");
listener.onStart();
Handler handler = new Handler();
handler.postDelayed(new Runnable() {
@Override
public void run() {
convertPCMToWav();
}
}, 200);
}
/**
* Prepares program
*/
private void prepare() {
prepareTempFiles();
}
/**
* Converts PCM to wav file. Automatically create new file.
*/
private void convertPCMToWav() {
Log.e("AudioProcessor", "Convert PCM TO Wav");
//ffmpeg -f s16le -ar 44.1k -ac 2 -i file.pcm file.wav
String[] cmd = { "-f" , "s16le", "-ar", "44.1k", "-i", micPcmFile.toString(), "-y", pcmtowavTempFile.toString()};
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setVisibility(View.VISIBLE);
percentProgress.setText("Converting your recording\n"+"1/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
convertMP3ToWav();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
convertPCMToWav();
}
});
}
/**
* Converts mp3 file to wav file.
* Automatically creates Wav file
*/
private void convertMP3ToWav() {
Log.e("AudioProcessor", "Convert MP3 TO Wav");
//ffmpeg -i file.mp3 file.wav
String[] cmd = { "-i" , backgroundMp3File.toString(), "-y", mp3towavTempFile.toString() };
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Converting background audio\n"+"2/5");
Log.e("AudioProcessor", "Starging convertgf MP3 TO Wav");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
changeMicAudio();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
Log.e("AudioProcessor", "Failed to convert MP3 TO Wav");
//onError(message);
throw new RuntimeException("Failed to convert MP3 TO Wav");
//convertMP3ToWav();
}
});
}
/**
* Combines 2 wav files into one wav file. Overlays audio
*/
private void combineWavs() {
Log.e("AudioProcessor", "Combine wavs");
//ffmpeg -i C:\Users\VR1\Desktop\_mp3.wav -i C:\Users\VR1\Desktop\_pcm.wav -filter_complex amix=inputs=2:duration=first:dropout_transition=3 C:\Users\VR1\Desktop\out.wav
String[] cmd = { "-i" , pcmtowavTempFile.toString(), "-i", volumeChangedTempFile.toString(), "-filter_complex", "amix=inputs=2:duration=first:dropout_transition=3", "-y",combinedwavTempFile.toString()};
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Combining the two audio files\n"+"4/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
encodeWavToAAC();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
}
});
}
private void changeMicAudio(){
Log.e("AudioProcessor", "Change audio volume");
//ffmpeg -i input.wav -filter:a "volume=1.5" output.wav
String[] cmdy = { "-i", mp3towavTempFile.toString(), "-af", "volume=0.9", "-y",volumeChangedTempFile.toString()};
ffmpeg.execute(cmdy, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Normalizing volume\n"+"3/5");
}
@Override
public void onSuccess(String message) {
combineWavs();
super.onSuccess(message);
}
@Override
public void onFailure(String message) {
super.onFailure(message);
Log.e("AudioProcessor", message);
}
});
}
/**
* Do something on error. Releases program data (deletes files)
* @param message
*/
private void onError(String message) {
release();
if (listener != null) {
//listener.onError(message);
}
}
/**
* Encode to AAC
*/
private void encodeWavToAAC() {
Log.e("AudioProcessor", "Encode to AAC");
//ffmpeg -i file.wav -c:a aac -b:a 128k -f adts output.m4a
String[] cmd = { "-i" , combinedwavTempFile.toString(), "-c:a", "aac", "-b:a", "128k", "-f", "adts", "-y",outputFile.toString()};
ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
@Override
public void onStart() {
super.onStart();
percentProgress.setText("Normalizing volume\n"+"3/5");
}
@Override
public void onSuccess(String message) {
super.onSuccess(message);
if (listener != null) {
listener.onSuccess(outputFile);
}
release();
}
@Override
public void onFailure(String message) {
super.onFailure(message);
onError(message);
encodeWavToAAC();
}
});
}
/**
* Uninitializes class
*/
private void release() {
if (listener != null) {
listener.onFinish();
}
destroyTempFiles();
}
/**
* Prepares the required temp files by deleting them if they exist.
* The files cannot exist before the ffmpeg actions run; FFmpeg creates them automatically.
*/
private void prepareTempFiles() {
pcmtowavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_pcm.wav");
mp3towavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_mp3.wav");
combinedwavTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_combined.wav");
volumeChangedTempFile = new File(context.getFilesDir()+ Common.TEMP_LOCAL_DIR + "/" + "_volumeChanged.wav");
destroyTempFiles();
}
/**
* Destroys temp required files
*/
private void destroyTempFiles() {
pcmtowavTempFile.delete();
mp3towavTempFile.delete();
combinedwavTempFile.delete();
volumeChangedTempFile.delete();
}
/**
* Checks if all files are set, so we can process them
* @return - all files ready
*/
private boolean checkIfAllFilesPresent() {
if(micPcmFile == null || backgroundMp3File == null || outputFile == null) {
Log.e("AudioProcessor", "All files are not set! Set all files!");
return false;
}
return true;
}
public void setOutputFile(File outputFile) {
this.outputFile = outputFile;
}
public void setListener(AudioProcessorListener listener) {
this.listener = listener;
}
public void setMicPcmFile(File micPcmFile) {
this.micPcmFile = micPcmFile;
}
public void setBackgroundMp3File(File backgroundMp3File) {
this.backgroundMp3File = backgroundMp3File;
}
public interface AudioProcessorListener {
void onStart();
void onSuccess(File output);
void onError(String message);
void onFinish();
}
}
The way I usually test it and get the crashes is to let the AudioProcessor get to the second method, convertMP3ToWav(), and then close the application. When I start it back up and begin processing the files again, the application crashes.
I have tried many approaches; I thought about sending the application back to the start when it is minimized, with this code in the main Activity:
@Override
protected void onUserLeaveHint()
{
if (Build.VERSION.SDK_INT >= 21) {
finishAndRemoveTask();
} else {
finish();
}
}
I thought that would stop everything, but it still kept crashing. After doing some debugging I found that even when I minimize the app and call finishAndRemoveTask(), the AudioProcessor is still working: it still runs all of the ffmpeg commands and even calls the onSuccess()/onFinish() methods.
How can I completely stop everything, or at least stop and clear the AudioProcessor, when the application is minimized?
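One possible direction, sketched only from what already appears in these listings: the first question's AudioProcessor keeps the FFtask returned by ffmpeg.execute() and exposes a killCurrentTask() helper that calls killRunningProcess() on it. Keeping a reference to the processor in the Activity and killing the running task when the Activity leaves the foreground might stop the native process; the audioProcessor field below is a hypothetical stand-in for the local ac variable created above.

@Override
protected void onStop() {
    super.onStop();
    // Stop the running ffmpeg task instead of letting it keep working
    // after finishAndRemoveTask(); killCurrentTask() wraps
    // FFtask.killRunningProcess() in the first listing's AudioProcessor.
    if (audioProcessor != null) {
        audioProcessor.killCurrentTask();
    }
}
-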
Injecting Metadata into an .mp4 file
21 August 2019, by TrueCP5
I'm working on a library that injects metadata into a .mp4 file so that the video is displayed correctly as a 360 video. The input file is a standard .mp4 file in the equirectangular format. I know what metadata needs to be injected; I just do not know how to inject it.
I spent some time looking around for libraries that can do this, but could only find ones for extracting metadata, not injecting/embedding/writing it. The alternative I found was to use Spatial Media as a command-line application to inject the metadata more easily. The problem is that I know zero Python whatsoever, so I'm leaning towards a library/NuGet package/ffmpeg script.
Does a good NuGet package/library exist that can do this, or should I go for the alternative option?
Edit 1
I have tried just pasting the metadata into the correct place in the file, just in case it might work, but it didn't.
Edit 2
This is the metadata injected by Google's Spatial Media tool, which is what I am trying to achieve:
<?xml version="1.0"?>
<rdf:SphericalVideo xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">
<GSpherical:Spherical>true</GSpherical:Spherical>
<GSpherical:Stitched>true</GSpherical:Stitched>
<GSpherical:StitchingSoftware>Spherical Metadata Tool</GSpherical:StitchingSoftware>
<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>
</rdf:SphericalVideo>
Edit 3
I've also tried to do it with ffmpeg like so:
ffmpeg -i input.mp4 -movflags use_metadata_tags -metadata Spherical=true -metadata Stitched=true -metadata ProjectionType=equirectangular -metadata StitchingSoftware=StreetviewJourney -codec copy output.mp4
I think the issue with the ffmpeg method is that it does not produce the rdf:SphericalVideo part, which is what allows the spherical video tags to be used.
Edit 4
When I extract the metadata using ffmpeg, the spherical tag shows up in the log, but not when I output it to an ffmetadata file. This was the command I used:
ffmpeg -i injected.mp4 -map_metadata -1 -f ffmetadata data.txt
This is the output of the log:
fps, 60 tbr, 15360 tbn, 120 tbc (default)
Metadata:
handler_name : VideoHandler
Side data:
spherical: equirectangular (0.000000/0.000000/0.000000)
Edit 5
I also tried to get the metadata using this command:
ffprobe -v error -select_streams v:0 -show_streams -of default=noprint_wrappers=1 injected.mp4
This was the log it outputted:
TAG:handler_name=VideoHandler
side_data_type=Spherical Mapping
projection=equirectangular
yaw=0
pitch=0
roll=0
I then tried to use this command, but it didn't work:
ffmpeg -i chapmanspeak.mp4 -movflags use_metadata_tags -metadata side_metadata_type="Spherical Mapping" -metadata projection=equirectangular -metadata yaw=0 -metadata pitch=0 -metadata roll=0 -codec copy output.mp4
Edit 6
I tried @VC.One's method, but I must be doing something wrong because the output file is unplayable. Here is my code:
public static void Metadata(string inputFile, string outputFile)
{
byte[] metadata = HexStringToByteArray("3C 3F 78 6D 6C 20 76 65 72 73 69 6F 6E 3D 22 31 2E 30 22 3F 3E 3C 72 64 66 3A 53 70 68 65 72 69 63 61 6C 56 69 64 65 6F 0A 78 6D 6C 6E 73 3A 72 64 66 3D 22 68 74 74 70 3A 2F 2F 77 77 77 2E 77 33 2E 6F 72 67 2F 31 39 39 39 2F 30 32 2F 32 32 2D 72 64 66 2D 73 79 6E 74 61 78 2D 6E 73 23 22 0A 78 6D 6C 6E 73 3A 47 53 70 68 65 72 69 63 61 6C 3D 22 68 74 74 70 3A 2F 2F 6E 73 2E 67 6F 6F 67 6C 65 2E 63 6F 6D 2F 76 69 64 65 6F 73 2F 31 2E 30 2F 73 70 68 65 72 69 63 61 6C 2F 22 3E 3C 47 53 70 68 65 72 69 63 61 6C 3A 53 70 68 65 72 69 63 61 6C 3E 74 72 75 65 3C 2F 47 53 70 68 65 72 69 63 61 6C 3A 53 70 68 65 72 69 63 61 6C 3E 3C 47 53 70 68 65 72 69 63 61 6C 3A 53 74 69 74 63 68 65 64 3E 74 72 75 65 3C 2F 47 53 70 68 65 72 69 63 61 6C 3A 53 74 69 74 63 68 65 64 3E 3C 47 53 70 68 65 72 69 63 61 6C 3A 53 74 69 74 63 68 69 6E 67 53 6F 66 74 77 61 72 65 3E 53 70 68 65 72 69 63 61 6C 20 4D 65 74 61 64 61 74 61 20 54 6F 6F 6C 3C 2F 47 53 70 68 65 72 69 63 61 6C 3A 53 74 69 74 63 68 69 6E 67 53 6F 66 74 77 61 72 65 3E 3C 47 53 70 68 65 72 69 63 61 6C 3A 50 72 6F 6A 65 63 74 69 6F 6E 54 79 70 65 3E 65 71 75 69 72 65 63 74 61 6E 67 75 6C 61 72 3C 2F 47 53 70 68 65 72 69 63 61 6C 3A 50 72 6F 6A 65 63 74 69 6F 6E 54 79 70 65 3E 3C 2F 72 64 66 3A 53 70 68 65 72 69 63 61 6C 56 69 64 65 6F 3E");
byte[] stco = HexStringToByteArray("73 74 63 6F");
byte[] moov = HexStringToByteArray("6D 6F 6F 76");
byte[] trak = HexStringToByteArray("74 72 61 6B");
byte[] file = File.ReadAllBytes(inputFile);
//find trak
int trakPosition = 0;
for (int a = 0; a < file.Length - trak.Length; a++)
{
for (int b = 0; b < trak.Length; b++)
{
if (file[a + b] != trak[b])
break;
if (b == trak.Length - 1)
trakPosition = a;
}
}
if (trakPosition == 0)
throw new FileLoadException();
//add metadata
int trakLength = BitConverter.ToInt32(new ArraySegment<byte>(file, trakPosition - 4, 4).Reverse().ToArray(), 0);
var fileList = file.ToList();
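// Insert the raw XML bytes immediately after the end of the trak box:
// trakPosition points at the "trak" type tag, so trakPosition - 4 is the box's
// 4-byte size field and trakPosition - 4 + trakLength is the first byte after the box.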
fileList.InsertRange(trakPosition - 4 + trakLength, metadata);
file = fileList.ToArray();
////change length - tried this as well
//byte[] trakBytes = BitConverter.GetBytes(trakLength + metadata.Length).Reverse().ToArray();
//for (int i = 0; i < 4; i++)
// file[trakPosition - 4 + i] = trakBytes[i];
//find moov
int moovPosition = 0;
for (int a = 0; a < file.Length - moov.Length; a++)
{
for (int b = 0; b < moov.Length; b++)
{
if (file[a + b] != moov[b])
break;
if (b == moov.Length - 1)
moovPosition = a;
}
}
if (moovPosition == 0)
throw new FileLoadException();
//change length
int moovLength = BitConverter.ToInt32(new ArraySegment<byte>(file, moovPosition - 4, 4).Reverse().ToArray(), 0);
byte[] moovBytes = BitConverter.GetBytes(moovLength + metadata.Length).Reverse().ToArray();
for (int i = 0; i < 4; i++)
file[moovPosition - 4 + i] = moovBytes[i];
//find stco
int stcoPosition = 0;
for (int a = 0; a < file.Length - stco.Length; a++)
{
for (int b = 0; b < stco.Length; b++)
{
if (file[a + b] != stco[b])
break;
if (b == stco.Length - 1)
stcoPosition = a;
}
}
if (stcoPosition == 0)
throw new FileLoadException();
//modify entries
int stcoEntries = BitConverter.ToInt32(new ArraySegment<byte>(file, stcoPosition + 8, 4).Reverse().ToArray(), 0);
for (int a = stcoPosition + 12; a < stcoPosition + 12 + stcoEntries * 4; a += 4)
{
int entryLength = BitConverter.ToInt32(new ArraySegment<byte>(file, a, 4).Reverse().ToArray(), 0);
byte[] newEntry = BitConverter.GetBytes(entryLength + metadata.Length).Reverse().ToArray();
for (int b = 0; b < 4; b++)
file[a + b] = newEntry[b];
}
File.WriteAllBytes(outputFile, file);
}
private static byte[] HexStringToByteArray(string hex)
{
hex = hex.Replace(" ", "");
return Enumerable.Range(0, hex.Length)
.Where(x => x % 2 == 0)
.Select(x => Convert.ToByte(hex.Substring(x, 2), 16))
.ToArray();
}
The bytes are reversed because the box sizes in an .mp4 are stored big-endian, while BitConverter reads them in the platform's little-endian order. I tried to also update the length of trak, but that didn't work either.
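For context on the structure being patched above: an MP4 file is a sequence of boxes, each beginning with a 4-byte big-endian size followed by a 4-byte ASCII type (such as moov, trak or stco), and container boxes like moov simply nest more boxes inside. Below is a minimal, self-contained sketch of walking the top-level boxes, written in Java to match the other listings on this page; the file name is a hypothetical placeholder.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;

public class BoxWalker {
    public static void main(String[] args) throws IOException {
        // "input.mp4" is a placeholder path
        try (RandomAccessFile f = new RandomAccessFile("input.mp4", "r")) {
            long pos = 0;
            while (pos < f.length()) {
                f.seek(pos);
                long size = f.readInt() & 0xFFFFFFFFL; // 4-byte big-endian box size
                byte[] type = new byte[4];
                f.readFully(type);                     // 4-byte box type, e.g. "moov"
                if (size == 1) {
                    size = f.readLong();               // 64-bit size used for large boxes
                }
                System.out.println(new String(type, StandardCharsets.US_ASCII) + " " + size);
                if (size == 0) {
                    break;                             // size 0 means the box runs to end of file
                }
                pos += size;
            }
        }
    }
}

The same big-endian convention applies to the 32-bit chunk offsets inside stco, which is why every value in the C# listing is passed through Reverse() before BitConverter reads it.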