
Other articles (50)
-
Creating farms of unique websites
13 April 2011 —
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
-
Authorizations overridden by plugins
27 April 2010 — Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page
-
Changing your graphic theme
22 February 2011 —
The graphic theme does not affect the actual layout of the elements on the page. It only modifies the appearance of the elements.
The placement can indeed be modified, but this modification is only visual, not at the level of the page's semantic representation.
Changing the graphic theme in use
To change the graphic theme in use, the zen-garden plugin must be enabled on the site.
Then simply go to the configuration area of the (...)
On other sites (5770)
-
Dreamcast SD Adapter and DreamShell
31 December 2014, by Multimedia Mike — Sega Dreamcast
Nope! I'm never going to let go of Sega Dreamcast hacking. When I was playing around with Dreamcast hacking early last year, I became aware that there is such a thing as an SD card adapter for the DC that plugs into the port normally reserved for the odd DC link cable. Of course I wanted to see what I could do with it.
The primary software that leverages the DC SD adapter is called DreamShell. Working with this adapter and the software requires some skill and guesswork. Searching for these topics tends to turn up results from various forums where people are trying to cargo-cult their way to solutions. I have a strange feeling that this post might become the unofficial English-language documentation on the matter.
Use Cases
What can you do with this thing? Undoubtedly, the primary use is for backing up (ripping) the contents of GD-ROMs (the custom optical format used for the DC) and playing those backed up (ripped) copies. Presumably, users of this device leverage the latter use case more than the former, i.e., download ripped games, load them on the SD card, and launch them using DreamShell. However, there are other uses such as multimedia playback, system exploration, BIOS reprogramming, high-level programming, and probably a few other things I haven't figured out yet.
Delivery
I put in an order via the dc-sd.com website and in about 2 short months, the item arrived from China. This marked my third lifetime delivery from China and curiously, all 3 of the shipments have pertained to the Sega Dreamcast.
I thought it was very interesting that this adapter came in such complete packaging. The text is all in Chinese, though the back states “Windows 98 / ME / 2000 / XP, Mac OS 9.1, LINUX2.4”. That’s what tipped me off that they must have just cannibalized some old USB SD card readers and packaging in order to create these. Closer inspection of the internals through the translucent pink case confirms this.
Usage
According to its change log, DreamShell has been around for a long time, with version 1.0.0 released in February of 2004. The current version is 4.0.0 RC3. There are several downloads available:
- DreamShell 4.0 RC 3 CDI Image
- DreamShell 4.0 RC 3 + Boot Loader
- DreamShell 4.0 RC 3 + Core CDI image
Option #2 worked for me. It contains a CDI disc image and the DreamShell files in a directory named DS/.
Burn the CDI to a CD-R in the normal way you would burn a bootable Dreamcast disc from a CDI image. This is open-ended and left as an exercise to the reader, since there are many procedures depending on platform. On Linux, I used a small script I found once called burncdi-dc.sh.
Then, copy the contents of the DS/ folder to an SD card. As for filesystem, FAT16 and FAT32 are both known to work. The files in DS/ should land in the root of the SD card; the folder DS/ should not be in the root.
Plug the SD card into the DC SD adapter and plug the adapter into the link cable port on the back of the Dreamcast. Then, boot the disc. If it works, you will see this minor corruption of the usual Sega licensing screen:
Then, there will be a brief white-on-black text screen that explains the booting process:
Then, there will be the main DreamShell logo:
Finally, you will land on the DreamShell main desktop:
Skepticism
At first, I was supremely skeptical of the idea that this SD adapter could perform speedily enough to play games reasonably. This was predicated on the observation that my DC coder's cable that I used to use for homebrew development could not transfer faster than 115200 bits/second, amounting to about 11 kbytes/sec. I assumed that this was a fundamental limitation of the link port. In fact, I ripped a few of my Dreamcast discs over a decade ago and still have those rips lying around. So I copied the ISO image of Resident Evil: Code Veronica — the game I personally played most on the DC — to the SD card (anywhere works) and used the "ISO loader" icon seen on the desktop above to launch the game.
It works:
The opening FMV plays at full speed. Everything loads as fast as I remember. I was quite surprised.
Digression: My assumptions about serial speeds have often been mistaken. 10 years ago, I heard stories about how we would soon be able to watch streaming video on our cell phones. I scoffed because I thought the 56K limitation of dialup modems was some sort of fundamental speed-of-light type of limitation for telephony bandwidth, wired or wireless.
The desktop menu also includes a 'speedtest' tool that profiles the write and read performance of your preferred storage medium. For my fastest SD card (a PNY 2 GB card):
This is probably more representative of the adapter's true bandwidth, since the same card reads and writes a good deal faster through more modern interfaces on a PC or Mac.
Look at the other options on the speedtest console. Hard drive? Apparently, it's possible, but it requires a good deal more hardware hacking than just purchasing this SD adapter.
Ripping
As you can see from the Resident Evil screenshot, playing games works quite nicely. How about ripping? I'm pleased to say that DreamShell has a beautiful ripping interface:
Enter a name for the disc (or read the disc label), select the storage medium, and let it, well, rip. It indicates which track it’s working on and the Sega logo acts as a progress bar, shading blue as the track rip progresses.
I'm finally, efficiently, archiving that collection of Sega Dreamcast demo discs; I'm hoping they'll eventually find a home at the Internet Archive. How is overall ripping performance? Usually about 38-40 minutes to rip a full 900-1000 MB. That certainly beats the 27-28 hours that were required when I performed the ripping at 11 kbytes/sec via the DC coder's cable.
All is well until I get a sector reading error:
That's when it can come in handy to have 3 DC consoles (see?! not crazy!).
Other Uses
There's a file explorer. You can browse the filesystem of the SD card, visual memory unit, or the CD portion of the GD-ROM (it would be more useful if it accessed the GD area). There are FFmpeg files included, so I threw a random Cinepak file and a random MPEG-1 file at it to see what happens. MPEG-1 didn't do anything, but this Cinepak file from some Sierra game played handily:
If you must enter strings, it helps to have a Dreamcast keyboard (which I do). Failing that, here's a glimpse of the onscreen keyboard that DreamShell provides:
Learning to use it is a game in itself.
There is an option of installing DreamShell in the BIOS. I did not attempt this. I don't know if it's possible (not like there's a lot of documentation); perhaps a custom BIOS modchip is needed. But here's what the screen looks like:
There is also a plain console to interact with (better have a physical keyboard). There are numerous file manipulation commands and custom system interaction commands. I see one interesting command called ‘addr’ that looks useful for dumping memory regions to a file.
A Lua language interpreter is also built in. I would love to play with this if I could ascertain whether DreamShell provided Dreamcast-specific APIs.
Tips And Troubleshooting
I have 3 Dreamcast consoles, affectionately named Terran, Protoss, and Zerg after the StarCraft II stickers with which they are adorned. Some seem to work better than others. Protoss seemed to be able to boot the DreamShell disc more reliably than the others. However, I was alarmed one morning when it couldn't boot even though it had been churning along fine the previous day. I think the problem was simply that it was cold. I put in a normal GD-ROM, let it warm up on that disc for a while, and then DreamShell booted fine. So that's my piece of cargo-culting troubleshooting advice.
-
The Ultimate Guide to HeatMap Software
-
VLC dead input for RTP stream
27 March, by CaptainCheese
I'm working on creating an RTP stream that's meant to display live waveform data from Pioneer prolink players. The motivation for sending this video out is to be able to receive it in a Flutter frontend. I initially was just sending a base-24 encoding of the raw ARGB packed ints per frame across a Kafka topic, but processing this data in Flutter proved untenable and was bogging down the main UI thread. I'm not sure this is the optimal way of going about it, but I'm just trying to get anything to work if it means some speedup on the frontend. The issue with the following implementation is that when I run
vlc --rtsp-timeout=120000 --network-caching=30000 -vvvv stream_1.sdp
where

% cat stream_1.sdp
v=0
o=- 0 1 IN IP4 127.0.0.1
s=RTP Stream
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat
m=video 5007 RTP/AVP 96
a=rtpmap:96 H264/90000



I see (among other questionable logs) the following:


[0000000144c44d10] live555 demux error: no data received in 10s, aborting
[00000001430ee2f0] main input debug: EOF reached
[0000000144b160c0] main decoder debug: killing decoder fourcc `h264'
[0000000144b160c0] main decoder debug: removing module "videotoolbox"
[0000000144b164a0] main packetizer debug: removing module "h264"
[0000000144c44d10] main demux debug: removing module "live555"
[0000000144c45bb0] main stream debug: removing module "record"
[0000000144a64960] main stream debug: removing module "cache_read"
[0000000144c29c00] main stream debug: removing module "filesystem"
[00000001430ee2f0] main input debug: Program doesn't contain anymore ES
[0000000144806260] main playlist debug: dead input
[0000000144806260] main playlist debug: changing item without a request (current 0/1)
[0000000144806260] main playlist debug: nothing to play
[0000000142e083c0] macosx interface debug: Playback has been ended
[0000000142e083c0] macosx interface debug: Releasing IOKit system sleep blocker (37463)



This is sort of confusing because when I run
ffmpeg -protocol_whitelist file,crypto,data,rtp,udp -i stream_1.sdp -vcodec libx264 -f null -

I see a number of logs about

[h264 @ 0x139304080] non-existing PPS 0 referenced
 Last message repeated 1 times
[h264 @ 0x139304080] decode_slice_header error
[h264 @ 0x139304080] no frame!



After which I see the stream is received and I start getting telemetry on it:


Input #0, sdp, from 'stream_1.sdp':
 Metadata:
 title : RTP Stream
 Duration: N/A, start: 0.016667, bitrate: N/A
 Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1200x200, 60 fps, 60 tbr, 90k tbn
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x107f04f40] using cpu capabilities: ARMv8 NEON
[libx264 @ 0x107f04f40] profile High, level 3.1, 4:2:0, 8-bit
Output #0, null, to 'pipe:':
 Metadata:
 title : RTP Stream
 encoder : Lavf61.7.100
 Stream #0:0: Video: h264, yuv420p(tv, progressive), 1200x200, q=2-31, 60 fps, 60 tbn
 Metadata:
 encoder : Lavc61.19.101 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
[out#0/null @ 0x60000069c000] video:144KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
frame= 1404 fps= 49 q=-1.0 Lsize=N/A time=00:00:23.88 bitrate=N/A speed=0.834x



Not sure why VLC is turning me down like some kind of Berghain bouncer that lets nobody in the entire night.


I initially tried just converting the ARGB ints to a YUV420p buffer and using that to create the Frame objects, but I couldn't for the life of me figure out how to properly initialize it, as my attempts kept spitting out garbled junk.

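For context, here's a minimal sketch of the kind of ARGB-to-planar-YUV420p conversion I was attempting, assuming BT.601 full-range coefficients and even frame dimensions (argbToYuv420p is a hypothetical, untested helper shown only to illustrate the layout):

 public static byte[] argbToYuv420p(int[] argb, int width, int height) {
     // One full-resolution Y plane followed by quarter-resolution U and V planes (4:2:0)
     byte[] yuv = new byte[width * height * 3 / 2];
     int uOffset = width * height;
     int vOffset = uOffset + (width / 2) * (height / 2);

     for (int row = 0; row < height; row++) {
         for (int col = 0; col < width; col++) {
             int p = argb[row * width + col];
             int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;

             // Luma for every pixel (BT.601, full range)
             int y = (int) (0.299 * r + 0.587 * g + 0.114 * b);
             yuv[row * width + col] = (byte) Math.max(0, Math.min(255, y));

             // One chroma sample per 2x2 block, taken from its top-left pixel
             if ((row & 1) == 0 && (col & 1) == 0) {
                 int u = (int) (-0.168736 * r - 0.331264 * g + 0.5 * b + 128);
                 int v = (int) (0.5 * r - 0.418688 * g - 0.081312 * b + 128);
                 int ci = (row / 2) * (width / 2) + (col / 2);
                 yuv[uOffset + ci] = (byte) Math.max(0, Math.min(255, u));
                 yuv[vOffset + ci] = (byte) Math.max(0, Math.min(255, v));
             }
         }
     }
     return yuv;
 }

Even with a buffer laid out like this, I wasn't sure how JavaCV expects planar data and strides to be handed to recordImage, which is part of why the code below sticks with RGB24.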

Please go easy on me; I've made an unhealthy habit of resolving nearly all of my coding questions by simply lurking the internet for answers, but that's not really helping me solve this issue.


Here's the Java I'm working on (the meat of the RTP comms occurs within updateWaveformForPlayer()):

package com.bugbytz.prolink;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber;
import org.deepsymmetry.beatlink.CdjStatus;
import org.deepsymmetry.beatlink.DeviceAnnouncement;
import org.deepsymmetry.beatlink.DeviceAnnouncementAdapter;
import org.deepsymmetry.beatlink.DeviceFinder;
import org.deepsymmetry.beatlink.Util;
import org.deepsymmetry.beatlink.VirtualCdj;
import org.deepsymmetry.beatlink.data.BeatGridFinder;
import org.deepsymmetry.beatlink.data.CrateDigger;
import org.deepsymmetry.beatlink.data.MetadataFinder;
import org.deepsymmetry.beatlink.data.TimeFinder;
import org.deepsymmetry.beatlink.data.WaveformDetail;
import org.deepsymmetry.beatlink.data.WaveformDetailComponent;
import org.deepsymmetry.beatlink.data.WaveformFinder;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import java.nio.ByteBuffer;
import java.text.DecimalFormat;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_RGB24;

public class App {
 public static ArrayList<Track> tracks = new ArrayList<>();
 public static boolean dbRead = false;
 public static Properties props = new Properties();
 private static Map<Integer, FFmpegFrameRecorder> recorders = new HashMap<>();
 private static Map<Integer, Integer> frameCount = new HashMap<>();

 private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
 private static final int FPS = 60;
 private static final int FRAME_INTERVAL_MS = 1000 / FPS;

 private static Map<Integer, ScheduledFuture<?>> schedules = new HashMap<>();

 private static Set<Integer> streamingPlayers = new HashSet<>();

 public static String byteArrayToMacString(byte[] macBytes) {
 StringBuilder sb = new StringBuilder();
 for (int i = 0; i < macBytes.length; i++) {
 sb.append(String.format("%02X%s", macBytes[i], (i < macBytes.length - 1) ? ":" : ""));
 }
 return sb.toString();
 }

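 // Renders the latest waveform for a player into a BufferedImage, writes it out as a
 // single-frame MP4, re-reads that frame, and pushes it to the player's RTP recorder.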
 private static void updateWaveformForPlayer(int player) throws Exception {
 Integer frame_for_player = frameCount.get(player);
 if (frame_for_player == null) {
 frame_for_player = 0;
 frameCount.putIfAbsent(player, frame_for_player);
 }

 if (!WaveformFinder.getInstance().isRunning()) {
 WaveformFinder.getInstance().start();
 }
 WaveformDetail detail = WaveformFinder.getInstance().getLatestDetailFor(player);

 if (detail != null) {
 WaveformDetailComponent component = (WaveformDetailComponent) detail.createViewComponent(
 MetadataFinder.getInstance().getLatestMetadataFor(player),
 BeatGridFinder.getInstance().getLatestBeatGridFor(player)
 );
 component.setMonitoredPlayer(player);
 component.setPlaybackState(player, TimeFinder.getInstance().getTimeFor(player), true);
 component.setAutoScroll(true);
 int width = 1200;
 int height = 200;
 Dimension dimension = new Dimension(width, height);
 component.setPreferredSize(dimension);
 component.setSize(dimension);
 component.setScale(1);
 component.doLayout();

 // Create a fresh BufferedImage and clear it before rendering
 BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
 Graphics2D g = image.createGraphics();
 g.clearRect(0, 0, width, height); // Clear any old content

 // Draw waveform into the BufferedImage
 component.paint(g);
 g.dispose();

 int port = 5004 + player;
 String inputFile = port + "_" + frame_for_player + ".mp4";
 // Initialize the FFmpegFrameRecorder for YUV420P
 FFmpegFrameRecorder recorder_file = new FFmpegFrameRecorder(inputFile, width, height);
 FFmpegLogCallback.set(); // Enable FFmpeg logging for debugging
 recorder_file.setFormat("mp4");
 recorder_file.setVideoCodec(avcodec.AV_CODEC_ID_H264);
 recorder_file.setPixelFormat(avutil.AV_PIX_FMT_YUV420P); // Use YUV420P format directly
 recorder_file.setFrameRate(FPS);

 // Set video options
 recorder_file.setVideoOption("preset", "ultrafast");
 recorder_file.setVideoOption("tune", "zerolatency");
 recorder_file.setVideoOption("x264-params", "repeat-headers=1");
 recorder_file.setGopSize(FPS);
 try {
 recorder_file.start(); // Ensure this is called before recording any frames
 System.out.println("Recorder started successfully for player: " + player);
 } catch (org.bytedeco.javacv.FFmpegFrameRecorder.Exception e) {
 e.printStackTrace();
 }

 // Get all pixels in one call
 int[] pixels = new int[width * height];
 image.getRGB(0, 0, width, height, pixels, 0, width);
 recorder_file.recordImage(width,height,Frame.DEPTH_UBYTE,1,3 * width, AV_PIX_FMT_RGB24, ByteBuffer.wrap(argbToByteArray(pixels, width, height)));
 recorder_file.stop();
 recorder_file.release();
 final FFmpegFrameRecorder recorder = recorders.get(player);
 FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputFile);


 try {
 grabber.start();
 } catch (Exception e) {
 e.printStackTrace();
 }
 if (recorder == null) {
 try {
 String outputStream = "rtp://127.0.0.1:" + port;
 FFmpegFrameRecorder initial_recorder = new FFmpegFrameRecorder(outputStream, grabber.getImageWidth(), grabber.getImageHeight());
 initial_recorder.setFormat("rtp");
 initial_recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
 initial_recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
 initial_recorder.setFrameRate(grabber.getFrameRate());
 initial_recorder.setGopSize(FPS);
 initial_recorder.setVideoOption("x264-params", "keyint=60");
 initial_recorder.setVideoOption("rtsp_transport", "tcp");
 initial_recorder.start();
 recorders.putIfAbsent(player, initial_recorder);
 frameCount.putIfAbsent(player, 0);
 putToRTP(player, grabber, initial_recorder);
 }
 catch (Exception e) {
 e.printStackTrace();
 }
 }
 else {
 putToRTP(player, grabber, recorder);
 }
 File file = new File(inputFile);
 if (file.exists() && file.delete()) {
 System.out.println("Successfully deleted file: " + inputFile);
 } else {
 System.out.println("Failed to delete file: " + inputFile);
 }
 }
 }

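 // Grabs the single frame from the temporary MP4 and records it onto the player's RTP
 // stream, forcing a keyframe once per second (every FPS frames).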
 public static void putToRTP(int player, FFmpegFrameGrabber grabber, FFmpegFrameRecorder recorder) throws FrameGrabber.Exception {
 final Frame frame = grabber.grabFrame();
 int frameCount_local = frameCount.get(player);
 frame.keyFrame = frameCount_local++ % FPS == 0;
 frameCount.put(player, frameCount_local);
 try {
 recorder.record(frame);
 } catch (FFmpegFrameRecorder.Exception e) {
 throw new RuntimeException(e);
 }
 }
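 // Drops the alpha channel, producing a tightly packed RGB24 byte array for recordImage().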
 public static byte[] argbToByteArray(int[] argb, int width, int height) {
 int totalPixels = width * height;
 byte[] byteArray = new byte[totalPixels * 3]; // 3 bytes per pixel (RGB, alpha dropped)

 for (int i = 0; i < totalPixels; i++) {
 int argbPixel = argb[i];

 byteArray[i * 3] = (byte) ((argbPixel >> 16) & 0xFF); // Red
 byteArray[i * 3 + 1] = (byte) ((argbPixel >> 8) & 0xFF); // Green
 byteArray[i * 3 + 2] = (byte) (argbPixel & 0xFF); // Blue
 }

 return byteArray;
 }


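 // Wires up the beat-link finders, publishes CDJ status updates to Kafka, and schedules a
 // per-player task that pushes waveform frames to RTP whenever a device is found.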
 public static void main(String[] args) throws Exception {
 VirtualCdj.getInstance().setDeviceNumber((byte) 4);
 CrateDigger.getInstance().addDatabaseListener(new DBService());
 props.put("bootstrap.servers", "localhost:9092");
 props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
 props.put("value.serializer", "com.bugbytz.prolink.CustomSerializer");
 props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "20971520");

 VirtualCdj.getInstance().addUpdateListener(update -> {
 if (update instanceof CdjStatus) {
 try (Producer<String, DeviceStatus> producer = new KafkaProducer<>(props)) {
 DecimalFormat df_obj = new DecimalFormat("#.##");
 DeviceStatus deviceStatus = new DeviceStatus(
 update.getDeviceNumber(),
 ((CdjStatus) update).isPlaying() || !((CdjStatus) update).isPaused(),
 ((CdjStatus) update).getBeatNumber(),
 update.getBeatWithinBar(),
 Double.parseDouble(df_obj.format(update.getEffectiveTempo())),
 Double.parseDouble(df_obj.format(Util.pitchToPercentage(update.getPitch()))),
 update.getAddress().getHostAddress(),
 byteArrayToMacString(DeviceFinder.getInstance().getLatestAnnouncementFrom(update.getDeviceNumber()).getHardwareAddress()),
 ((CdjStatus) update).getRekordboxId(),
 update.getDeviceName()
 );
 ProducerRecord<String, DeviceStatus> record = new ProducerRecord<>("device-status", "device-" + update.getDeviceNumber(), deviceStatus);
 try {
 producer.send(record).get();
 } catch (InterruptedException ex) {
 throw new RuntimeException(ex);
 } catch (ExecutionException ex) {
 throw new RuntimeException(ex);
 }
 producer.flush();
 if (!WaveformFinder.getInstance().isRunning()) {
 try {
 WaveformFinder.getInstance().start();
 } catch (Exception ex) {
 throw new RuntimeException(ex);
 }
 }
 }
 }
 });
 DeviceFinder.getInstance().addDeviceAnnouncementListener(new DeviceAnnouncementAdapter() {
 @Override
 public void deviceFound(DeviceAnnouncement announcement) {
 if (!streamingPlayers.contains(announcement.getDeviceNumber())) {
 streamingPlayers.add(announcement.getDeviceNumber());
 schedules.putIfAbsent(announcement.getDeviceNumber(), scheduler.scheduleAtFixedRate(() -> {
 try {
 Runnable task = () -> {
 try {
 updateWaveformForPlayer(announcement.getDeviceNumber());
 } catch (InterruptedException e) {
 System.out.println("Thread interrupted");
 } catch (Exception e) {
 throw new RuntimeException(e);
 }
 System.out.println("Lambda thread work completed!");
 };
 task.run();
 } catch (Exception e) {
 e.printStackTrace();
 }
 }, 0, FRAME_INTERVAL_MS, TimeUnit.MILLISECONDS));
 }
 }

 @Override
 public void deviceLost(DeviceAnnouncement announcement) {
 if (streamingPlayers.contains(announcement.getDeviceNumber())) {
 schedules.get(announcement.getDeviceNumber()).cancel(true);
 streamingPlayers.remove(announcement.getDeviceNumber());
 }
 }
 });
 BeatGridFinder.getInstance().start();
 MetadataFinder.getInstance().start();
 VirtualCdj.getInstance().start();
 TimeFinder.getInstance().start();
 DeviceFinder.getInstance().start();
 CrateDigger.getInstance().start();

 try {
 LoadCommandConsumer consumer = new LoadCommandConsumer("localhost:9092", "load-command-group");
 Thread consumerThread = new Thread(consumer::startConsuming);
 consumerThread.start();

 Runtime.getRuntime().addShutdownHook(new Thread(() -> {
 consumer.shutdown();
 try {
 consumerThread.join();
 } catch (InterruptedException e) {
 Thread.currentThread().interrupt();
 }
 }));
 Thread.sleep(60000);
 } catch (InterruptedException e) {
 System.out.println("Interrupted, exiting.");
 }
 }
}