
Other articles (78)
-
Customizing by adding your logo, banner or background image
5 September 2013 — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP via the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of the news type, the fields offered by default are: publication date (customize the publication date) (...)
-
Publishing on MediaSPIP
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or later. If necessary, contact your MediaSPIP administrator to find out.
On other sites (7940)
-
How to play raw h264 produced by MediaCodec encoder?
1 November 2014, by jackos2500 — I'm a bit new when it comes to MediaCodec (and video encoding/decoding in general), so correct me if anything I say here is wrong.
I want to play the raw H.264 output of MediaCodec with VLC/ffplay. I need this to play because my end goal is to stream some live video to a computer, and MediaMuxer only produces a file on disk rather than something I can stream with (very) low latency to a desktop. (I'm open to other solutions, but I have not found anything else that fits the latency requirement.)
Here is the code I'm using to encode the video and write it to a file (it's based on the MediaCodec example found here, only with the MediaMuxer part removed):
package com.jackos2500.droidtop;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLExt;
import android.opengl.EGLSurface;
import android.opengl.GLES20;
import android.os.Environment;
import android.util.Log;
import android.view.Surface;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class StreamH264 {
    private static final String TAG = "StreamH264";
    private static final boolean VERBOSE = true; // lots of logging

    // where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
    private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

    public static int MEGABIT = 1000 * 1000;
    private static final int IFRAME_INTERVAL = 10;

    private static final int TEST_R0 = 0;
    private static final int TEST_G0 = 136;
    private static final int TEST_B0 = 0;
    private static final int TEST_R1 = 236;
    private static final int TEST_G1 = 50;
    private static final int TEST_B1 = 186;

    private MediaCodec codec;
    private CodecInputSurface inputSurface;
    private BufferedOutputStream out;

    private MediaCodec.BufferInfo bufferInfo;

    public StreamH264() {
    }

    private void prepareEncoder() throws IOException {
        bufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2 * MEGABIT);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = new CodecInputSurface(codec.createInputSurface());
        codec.start();

        File dst = new File(OUTPUT_DIR, "test.264");
        out = new BufferedOutputStream(new FileOutputStream(dst));
    }

    private void releaseEncoder() throws IOException {
        if (VERBOSE) Log.d(TAG, "releasing encoder objects");
        if (codec != null) {
            codec.stop();
            codec.release();
            codec = null;
        }
        if (inputSurface != null) {
            inputSurface.release();
            inputSurface = null;
        }
        if (out != null) {
            out.flush();
            out.close();
            out = null;
        }
    }

    public void stream() throws IOException {
        try {
            prepareEncoder();
            inputSurface.makeCurrent();
            for (int i = 0; i < (30 * 5); i++) {
                // Feed any pending encoder output into the file.
                drainEncoder(false);

                // Generate a new frame of input.
                generateSurfaceFrame(i);
                inputSurface.setPresentationTime(computePresentationTimeNsec(i, 30));

                // Submit it to the encoder. The eglSwapBuffers call will block if the input
                // is full, which would be bad if it stayed full until we dequeued an output
                // buffer (which we can't do, since we're stuck here). So long as we fully drain
                // the encoder before supplying additional input, the system guarantees that we
                // can supply another frame without blocking.
                if (VERBOSE) Log.d(TAG, "sending frame " + i + " to encoder");
                inputSurface.swapBuffers();
            }
            // send end-of-stream to encoder, and drain remaining output
            drainEncoder(true);
        } finally {
            // release encoder, muxer, and input Surface
            releaseEncoder();
        }
    }

    private void drainEncoder(boolean endOfStream) throws IOException {
        final int TIMEOUT_USEC = 10000;
        if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

        if (endOfStream) {
            if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
            codec.signalEndOfInputStream();
        }
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        while (true) {
            int encoderStatus = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (!endOfStream) {
                    break; // out of while
                } else {
                    if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                outputBuffers = codec.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // should happen before receiving buffers, and should only happen once
                MediaFormat newFormat = codec.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + newFormat);
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                // let's ignore it
            } else {
                ByteBuffer encodedData = outputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                }
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status. Ignore it.
                    if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    bufferInfo.size = 0;
                }
                if (bufferInfo.size != 0) {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(bufferInfo.offset);
                    encodedData.limit(bufferInfo.offset + bufferInfo.size);

                    byte[] data = new byte[bufferInfo.size];
                    encodedData.get(data);
                    out.write(data);
                    if (VERBOSE) Log.d(TAG, "sent " + bufferInfo.size + " bytes to file");
                }
                codec.releaseOutputBuffer(encoderStatus, false);
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        Log.w(TAG, "reached end of stream unexpectedly");
                    } else {
                        if (VERBOSE) Log.d(TAG, "end of stream reached");
                    }
                    break; // out of while
                }
            }
        }
    }

    private void generateSurfaceFrame(int frameIndex) {
        frameIndex %= 8;
        int startX, startY;
        if (frameIndex < 4) {
            // (0,0) is bottom-left in GL
            startX = frameIndex * (1280 / 4);
            startY = 720 / 2;
        } else {
            startX = (7 - frameIndex) * (1280 / 4);
            startY = 0;
        }
        GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
        GLES20.glScissor(startX, startY, 1280 / 4, 720 / 2);
        GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
    }

    private static long computePresentationTimeNsec(int frameIndex, int frameRate) {
        final long ONE_BILLION = 1000000000;
        return frameIndex * ONE_BILLION / frameRate;
    }

    /**
     * Holds state associated with a Surface used for MediaCodec encoder input.
     * <p>
     * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
     * to create an EGL window surface. Calls to eglSwapBuffers() cause a frame of data to be sent
     * to the video encoder.
     * <p>
     * This object owns the Surface -- releasing this will release the Surface too.
     */
    private static class CodecInputSurface {
        private static final int EGL_RECORDABLE_ANDROID = 0x3142;

        private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
        private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

        private Surface mSurface;

        /**
         * Creates a CodecInputSurface from a Surface.
         */
        public CodecInputSurface(Surface surface) {
            if (surface == null) {
                throw new NullPointerException();
            }
            mSurface = surface;
            eglSetup();
        }

        /**
         * Prepares EGL. We want a GLES 2.0 context and a surface that supports recording.
         */
        private void eglSetup() {
            mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
            if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                throw new RuntimeException("unable to get EGL14 display");
            }
            int[] version = new int[2];
            if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                throw new RuntimeException("unable to initialize EGL14");
            }

            // Configure EGL for recording and OpenGL ES 2.0.
            int[] attribList = {
                    EGL14.EGL_RED_SIZE, 8,
                    EGL14.EGL_GREEN_SIZE, 8,
                    EGL14.EGL_BLUE_SIZE, 8,
                    EGL14.EGL_ALPHA_SIZE, 8,
                    EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                    EGL_RECORDABLE_ANDROID, 1,
                    EGL14.EGL_NONE
            };
            EGLConfig[] configs = new EGLConfig[1];
            int[] numConfigs = new int[1];
            EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                    numConfigs, 0);
            checkEglError("eglCreateContext RGB888+recordable ES2");

            // Configure context for OpenGL ES 2.0.
            int[] attrib_list = {
                    EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                    EGL14.EGL_NONE
            };
            mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                    attrib_list, 0);
            checkEglError("eglCreateContext");

            // Create a window surface, and attach it to the Surface we received.
            int[] surfaceAttribs = {
                    EGL14.EGL_NONE
            };
            mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                    surfaceAttribs, 0);
            checkEglError("eglCreateWindowSurface");
        }

        /**
         * Discards all resources held by this class, notably the EGL context. Also releases the
         * Surface that was passed to our constructor.
         */
        public void release() {
            if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                        EGL14.EGL_NO_CONTEXT);
                EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                EGL14.eglReleaseThread();
                EGL14.eglTerminate(mEGLDisplay);
            }
            mSurface.release();
            mEGLDisplay = EGL14.EGL_NO_DISPLAY;
            mEGLContext = EGL14.EGL_NO_CONTEXT;
            mEGLSurface = EGL14.EGL_NO_SURFACE;
            mSurface = null;
        }

        /**
         * Makes our EGL context and surface current.
         */
        public void makeCurrent() {
            EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
            checkEglError("eglMakeCurrent");
        }

        /**
         * Calls eglSwapBuffers. Use this to "publish" the current frame.
         */
        public boolean swapBuffers() {
            boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
            checkEglError("eglSwapBuffers");
            return result;
        }

        /**
         * Sends the presentation time stamp to EGL. Time is expressed in nanoseconds.
         */
        public void setPresentationTime(long nsecs) {
            EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
            checkEglError("eglPresentationTimeANDROID");
        }

        /**
         * Checks for EGL errors. Throws an exception if one is found.
         */
        private void checkEglError(String msg) {
            int error;
            if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
            }
        }
    }
}
However, the file produced by this code does not play with VLC or ffplay. Can anyone tell me what I'm doing wrong? I believe it is due to an incorrect format (or total lack) of the headers required to play raw H.264, as I have had success playing .264 files downloaded from the internet with ffplay. Also, I'm not sure exactly how I'm going to stream this video to a computer, so if somebody could give me some suggestions as to how I might do that, I would be very grateful! Thanks!
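A plausible culprit, offered as a sketch rather than a definitive diagnosis: drainEncoder() above deliberately zeroes out the BUFFER_FLAG_CODEC_CONFIG buffer, which for Android "video/avc" encoders typically carries the SPS/PPS parameter sets as Annex-B NAL units. With the MediaMuxer removed, nothing else puts those headers into the stream, so VLC/ffplay have no parameter sets to initialize from. The helper below (RawH264Writer and writeEncodedBuffer are hypothetical names, not from the original code) writes every non-empty buffer, config data included:

import android.media.MediaCodec;

import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

public final class RawH264Writer {
    private RawH264Writer() {}

    // Writes one encoder output buffer to a raw Annex-B stream. Unlike
    // drainEncoder() above, it does NOT discard CODEC_CONFIG buffers:
    // with no MediaMuxer, the SPS/PPS must travel inside the stream itself.
    public static void writeEncodedBuffer(ByteBuffer encodedData,
                                          MediaCodec.BufferInfo info,
                                          OutputStream out) throws IOException {
        if (info.size <= 0) {
            return;
        }
        encodedData.position(info.offset);
        encodedData.limit(info.offset + info.size);
        byte[] chunk = new byte[info.size];
        encodedData.get(chunk);
        out.write(chunk); // config buffers (SPS/PPS) are written like any frame
    }
}

Using it amounts to deleting the bufferInfo.size = 0; line in the CODEC_CONFIG branch so the write path runs for that buffer too. For the streaming half of the question, the same bytes can go to a Socket's OutputStream instead of a FileOutputStream; a raw Annex-B stream pushed over TCP can then be opened with something like ffplay -f h264 -i tcp://phone-ip:port (the exact invocation depends on which side listens).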
-
Vedanti and Max Sound vs. Google
14 August 2014, by Multimedia Mike — Legal/Ethical. Vedanti Systems Limited (VSL) and Max Sound Corporation filed a lawsuit against Google recently. Ordinarily, I wouldn't care about corporate legal battles. However, this one interests me because it's multimedia-related. I'm curious to know how coding-technology patents might hold up in a real court case.
Here's the most entertaining complaint in the lawsuit:
Despite Google’s well-publicized Code of Conduct — “Don’t be Evil” — which it explains is “about doing the right thing,” “following the law,” and “acting honorably,” Google, in fact, has an established pattern of conduct which is the exact opposite of its claimed piety.
I wonder if this is the first known case in which Google has been sued over its long-obsoleted "Don't be evil" mantra?
Researching The Plaintiffs
I think I made a mistake by assuming this lawsuit might have merit. My first order of business was to see what the plaintiff organizations have produced. I have a strong feeling that these might be run-of-the-mill patent trolls.
VSL currently has a blank web page. Further, the Wayback Machine only has pages reaching back to 2011. The earliest page lists these claims against a plain black background (I've highlighted some of the more boisterous claims and the passages that make it appear that Vedanti doesn't actually produce anything but is strictly an IP organization):
The inventions key:
The patent and software reduced any data content, without compressing, up to a 97% total reduction of the data which also produces a lossless result. This physics based invention is often called the Holy Grail.
Vedanti Systems Intellectual Property
Our strategic IP portfolio is granted in all of the world's largest technology development and use countries. A major value indemnification of our licensee products is the early date of invention filing and subsequent Issue. Vedanti IP has an intrinsic 20 year patent protection and valuation in royalties and licensing. The original data transmission art has no prior art against it.
Vedanti Systems invented among other firsts, The Slice and Partitioning of Macroblocks within a RGB Tri level region in a frame to select or not, the pixel.
Vedanti Systems invention is used in nearly every wireless chipset and handset in the world
Our original pixel selection system revolutionized wireless handset communications. An example of this system "Slice" and "Macroblock Partitioning" is used throughout Satellite channel expansion, Wireless partitioning, Telecom – Video Conferencing, Surveillance Cameras, and 2010 developing Media applications.
Vedanti Systems is a Semiconductor based software, applications, and IP Continuations Intellectual Property company.
Let's move on to the other plaintiff, Max Sound. They have a significantly more substantive website. They also have an Android app named Spins HD Audio, which, based on the screenshots, appears to be little more than a music player.
Max Sound also has a stock ticker symbol: MAXD. Something clicked into place when I looked up their ticker symbol: while worth only a few pennies, it was worth a few more pennies after this lawsuit was announced, which might be one of the motivations behind the lawsuit.
Here's a trick I learned when I was looking for a new tech job last year: when I first look at a company's website and am trying to figure out what they really do, I head straight to their jobs/careers page. A lot of corporate websites have way too much blathering corporatese that can be tough to cut through. But when I see what mix of talent and specific skills they are hoping to hire, that gives me a much better portrait of what the company does.
The reason I bring this up is that this tech company doesn't seem to have a jobs/careers page.
The Lawsuit
The core complaint centers around Patent 7974339, "Optimized data transmission system and method". It was filed in July 2004 (or possibly as early as January 2002), issued in July 2011, and assigned (purchased?) by Vedanti in May 2012. The lawsuit alleges that nearly everything Google has ever produced (or, more accurately, purchased) leverages the patented technology.
The patent itself has 5 drawings. If you've ever seen a multimedia codec patent, or any whitepaper on a multimedia codec, you've seen these graphs before. E.g., "Raw pixels come in here -> some analysis happens here -> more analysis happens over here -> entropy coding -> final bitstream". The text of a patent document isn't meant to be particularly useful. I've tried to understand this stuff before and it never goes well. Skimming the text, I just see a blur of the words data, transmission, pixel, and matrix.
So I read the complaint to try to figure out what this is all about. To summarize the storyline as narrated by the lawsuit, some inventors were unhappy with the state of video compression in 2001 and endeavored to create something better. So they did, and called it the VSL codec. This codec is so far undocumented on the MultimediaWiki, so it probably has yet to be seen “in the wild”. Good luck finding hard technical data on it now since searches for “VSL codec” are overwhelmed by articles about this lawsuit. Also, the original codec probably wasn’t called VSL because VSL is apparently an IP organization formed much later.
Then, the protagonists of the lawsuit patented the codec. Then, years later, Google wanted to purchase a video codec that they could open source and use to supplant H.264.
The complaint goes on to allege that in 2010, Google specifically contacted VSL to possibly license or acquire this mysterious VSL technology. Google was allegedly allowed to study the technology, eventually decided not to continue discussions, and shipped back the proprietary materials.
Here's where things get weird. When Google shipped back the materials, they allegedly shipped back a bunch of Post-It notes. The notes are alleged to contain a ton of incriminating evidence. The lawsuit claims that the notes contained such tidbits as:
- Google was concerned that its infringement could be considered "recklessness" (the standard applicable to willful infringement);
- Google personnel should "try" to destroy incriminating emails;
- Google should consider a "design around" because it was facing a "risk of litigation."
Actually, given Google’s acquisition of On2, I can totally believe that last one (On2’s codecs have famously contained a lot of weirdness which is commonly suspected to be attributable to designing around known patents).
Anyway, a lot of this case seems to hinge on the authenticity of these Post-It notes:
“65. The Post-It notes are unequivocal evidence of Google’s knowledge of the ’339 Patent and infringement by Defendants”
I wish I could find a stock photo of a stack of Post-It notes in an evidence bag.
I’ve worked at big technology companies. Big tech companies these days are very diligent about indoctrinating employees about IP liability issues. The reason this Post-It situation strikes me as odd is because the alleged contents of the notes basically outline everything the corporate lawyers tell you NOT to do.
Analysis
I'm trying to determine what specific algorithms and coding techniques are alleged to have been infringed. I guess I was expecting to see a specific claim that, "Our patent outlines this specific coding technique and here is unequivocal proof that Google A) uses the same technique, and B) specifically did so after looking at our patent." I didn't find that (well, a bit of part B, cf. the Post-It note debacle), but maybe that's not how these patent lawsuits operate. I've never kept up before.
Maybe it's just a patent troll. Maybe it's for the stock bump. I'm expecting to see pump-n-dump stock spam featuring the stock symbol MAXD any time now.
I've never been interested in following a lawsuit carefully before. I suddenly find myself wondering if I can subscribe to an RSS feed for this case? Too much to hope for. But I found this item through Pando, and maybe they'll stay on top of it.
-
FFmpeg quick P-Frame decoding
22 August 2014, by Lior Ohana — We're using FFmpeg to decode live H.264 video streams.
When we get to a point where we're not able to keep up with the live frame rate (i.e. high CPU), we start to drop frames before they reach the decoder.
Obviously, once we drop a frame, I need to drop all subsequent P-frames until the next I-frame (GOP). The result is that we sometimes have several seconds without video (until an I-frame arrives).
Is there any way in FFmpeg to do a "partial" P-frame decode that reduces decoding time on the one hand (assuming I don't need the uncompressed frame) while keeping the decoder in a "valid" state so it can decode the P-frames that follow?
Thanks,
Lior.
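Not part of the original question, but one direction worth exploring: libavcodec exposes per-context discard knobs that trade output fidelity for speed while keeping the decoder's reference state consistent. Setting skip_frame to AVDISCARD_NONREF makes the decoder parse, but not fully decode, frames that nothing else references (how much this saves depends on whether the stream actually contains non-reference frames), and setting skip_loop_filter to AVDISCARD_ALL skips in-loop deblocking everywhere, which saves CPU on every frame at the cost of gradual drift until the next I-frame. A minimal sketch in C, assuming an already-allocated AVCodecContext:

#include <libavcodec/avcodec.h>

/* Sketch: cheapen H.264 decoding under CPU pressure without invalidating
 * the decoder's reference chain. The AVDiscard levels chosen here are
 * tuning assumptions, not a definitive recipe. */
static void configure_fast_decode(AVCodecContext *ctx)
{
    /* Parse but do not fully decode frames that are not used as references,
     * so later P-frames remain decodable. */
    ctx->skip_frame = AVDISCARD_NONREF;

    /* Skip the in-loop deblocking filter on all frames; cheaper per frame,
     * with small errors accumulating until the next I-frame. */
    ctx->skip_loop_filter = AVDISCARD_ALL;
}

ffplay exposes the same knobs as -skip_frame and -skip_loop_filter options, which makes it easy to preview the visual cost before wiring this into a decode loop.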