Advanced search

Media (1)

Keyword: - Tags -/berlin

Other articles (49)

  • Contributing to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    Translation is done through SPIP’s translation interface, where all of MediaSPIP’s language modules are available. You just need to subscribe to the translators’ mailing list to ask for more information.
    MediaSPIP is currently only available in French and (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other formats (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

On other sites (9377)

  • How Media Analytics for Piwik gives you the insights you need to measure how effective your video and audio marketing is – Part 2

    https://piwik.org/media.mp4
    2 February 2017, by InnoCraft — Community

    In Part 1 we covered some of the Media Analytics features and explained why you cannot afford not to measure media usage on your website. Chances are, you are wasting or losing money and time by not making the most of your marketing strategy this very second. In this part, we continue showing you more of the insights you can expect to get from Media Analytics and how nicely it is integrated into Piwik.

    Video, Audio and Media Player reports

    Media Analytics adds several new reports around videos, audios and media players. They are all quite similar and give you similar insights so we will mainly focus on the Video Titles report.

    Metrics

    When you open such a report for the first time, you will see a report like this, with the following metrics:

    • “Impressions”, the number of times a visitor has viewed a page where this media was included.
    • “Plays”, the number of times a visitor watched or listened to this media.
    • “Play rate”, the percentage of visitors who watched or listened to a media after visiting a page where it was included.
    • “Finishes”, the number of times visitors played a media through to the end.
    • “Avg. time spent”, the average amount of time a visitor spent watching or listening to this media.
    • “Avg. media length”, the average length of the video or audio file; this may vary, for example, if the media is a stream.
    • “Avg. completion”, the average percentage of the video that visitors watched.

    If you are not sure what a certain metric means, simply hover over the metric title in the UI and you will get a detailed explanation. By changing the visualization to the “All Columns Table” at the bottom of the report, you get to see even more metrics, such as “Plays by unique visitors”, “Impressions by unique visitors”, “Finish rate”, “Avg. time to play (aka hesitation time)” and “Fullscreen rate”, and we keep adding more.
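
    To make these metrics concrete, here is a small worked example with made-up numbers (an editorial illustration, not data from the post; the plugin’s exact formulas may differ slightly, e.g. completion may be computed per play rather than from averages):

# Hypothetical raw counts for one video title (illustration only)
impressions = 1000            # pages viewed that contained the media
plays = 250                   # times the media was actually played
finishes = 50                 # plays that reached the end of the media
total_seconds_watched = 7500  # summed watch time over all plays
media_length = 120            # length of the video in seconds

play_rate = plays / impressions                 # 0.25 -> "Play rate" of 25%
finish_rate = finishes / plays                  # 0.20 -> "Finish rate" of 20%
avg_time_spent = total_seconds_watched / plays  # 30 s -> "Avg. time spent"
avg_completion = avg_time_spent / media_length  # 0.25 -> "Avg. completion" of 25%

print(f"{play_rate:.0%} {finish_rate:.0%} {avg_time_spent:.0f}s {avg_completion:.0%}")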

    These metrics are available for the following reports:

    • “Video / Audio Titles” shows you all metrics aggregated by video or audio title.
    • “Video / Audio Resource URLs” shows you all metrics aggregated by the video or audio resource URL, for example “https://piwik.org/media.mp4”.
    • “Video / Audio Resource URLs grouped” removes details such as the subdomain and file extension from the URLs, so you get aggregated metrics when you provide the same media in different formats (a rough sketch of this kind of grouping follows after this list).
    • “Videos per hour in website’s timezone” lets you find out how your media content is consumed depending on the hour of the day. You might realize that your media is consumed very differently in the morning vs. at night.
    • “Video Resolutions” lets you discover how your video is consumed depending on the resolution.
    • The “Media players” report is useful if you use different media players on your websites or apps and want to see how engagement with your media compares by media player.
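
    As a rough illustration of the kind of normalisation the “Resource URLs grouped” report performs (the plugin’s actual grouping rules may differ; this sketch only drops the subdomain, query string and file extension):

from urllib.parse import urlparse
from os.path import splitext

def group_media_url(url: str) -> str:
    """Collapse different variants of the same media resource into one key.
    Illustrative approximation only, not the plugin's real algorithm."""
    parsed = urlparse(url)
    host = ".".join(parsed.hostname.split(".")[-2:])  # "www.piwik.org" -> "piwik.org"
    path, _extension = splitext(parsed.path)          # "/media.mp4" -> "/media"
    return host + path

# Both variants end up in the same group ("piwik.org/media"); the second URL is
# a hypothetical alternative format of the example resource mentioned above.
print(group_media_url("https://piwik.org/media.mp4"))
print(group_media_url("https://www.piwik.org/media.webm?autoplay=1"))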

    Row evolution

    At InnoCraft, we understand that static numbers are not so useful. When you see, for example, that yesterday 20 visitors played a certain media, would you know whether this is good or bad? This is why we always give you the possibility to see the data in relation to the data recorded in the past. To see how a specific media performs over time, simply hover over a media title or media resource URL and click on the “Row Evolution” icon.

    Now you can see whether more or fewer visitors actually played your chosen video over the selected period. Simply click on any metric name and the chosen metrics will be plotted in the big evolution graph.

    This feature is similar to the Media Overall evolution graph introduced in Part 1, but shows you a detailed evolution for an individual media title or resource.

    Media details

    Now that you know some of the most important media metrics, you might want to look a bit deeper into the user behaviour. For example, we mentioned the “Avg. time spent on media” metric above. Such an average number doesn’t tell you whether most visitors spent about the same time watching the video, or whether many visitors watched it only for a few seconds while a few watched it for a very long time.

    One of the ways to get this insight is to hover over any media title or resource URL again and click on the “Media details” icon. It opens a new popup showing you a new set of reports like these:

    The “Time spent watching” and “How far visitors reached in the media” bar charts show you on the X-axis how much time each visitor spent watching a video and how far into the video they got, and on the Y-axis the number of visitors. This lets you discover, for example, whether your users often jump to the middle or end of the video, and which parts of your video were seen most often.

    The “How often the media was watched in a certain hour” and “Which resolutions the media was watched” reports are similar to the reports introduced in Part 1 of the blog post. However, this time, instead of showing aggregated video or audio content data, they display data for a specific media title or media resource URL.

    Segmented audience log

    In Part 1 we already introduced the Audience Log and explained that it is useful to better understand user behaviour. Just a quick recap: the Audience Log shows you, in chronological order, every action a specific visitor has performed on your website: which pages they viewed, how they interacted with your media, when they clicked somewhere, and much more.

    By hovering over a media title or a media resource and then selecting “Segmented audience log”, you get to see the same log, but this time it shows only visitors that have interacted with the selected media. This is useful, for example, when you notice an unusual value for a metric and want to better understand why that metric looks the way it does.

    Applying segments

    Media Analytics lets you apply any Piwik segment to the media reports, allowing you to slice and dice your visitors or personas and multiplying the value you get out of Media Analytics. For example, you may want to apply a segment and analyze the media usage of visitors who visited your website or mobile app for the first time vs. returning visitors. Sometimes it may be interesting to see how visitors who converted a specific goal or purchased something consume your media; the possibilities are endless. We really recommend taking advantage of segments to understand your different target groups even better.

    The plugin also adds a lot of new segments to your Piwik, letting you segment any Piwik report by visitors that have viewed or interacted with your media. For example, you could go to the “Visitors => Devices” report and apply a media segment to see which devices were used most to view your media. You can also combine segments to see, for example, how often your goals were converted when a visitor viewed media for longer than 10 seconds, waited at least 20 seconds before playing your media, and played at least 3 videos during their visit.
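
    As a sketch of what such a combined segment could look like when passed to Piwik (visitorType==new is a standard Piwik segment; the media-specific segment name media_plays below is a hypothetical placeholder, the real names are listed in the plugin’s segment reference):

from urllib.parse import urlencode

# ";" means AND in Piwik's segment syntax, "," means OR.
segment = "visitorType==new;media_plays>=3"   # "media_plays" is a made-up name

params = urlencode({
    "module": "API",
    "method": "VisitsSummary.get",   # any Piwik report accepts a segment parameter
    "idSite": 1,
    "period": "week",
    "date": "yesterday",
    "format": "JSON",
    "segment": segment,
    "token_auth": "YOUR_TOKEN",      # placeholder
})
print("https://your-piwik.example/index.php?" + params)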

    Widgets, Scheduled Reports, and more

    This is not where the fun ends. Media Analytics defines more than 15 new widgets that you can add to your dashboard or export into a third-party website. You can set up Scheduled Reports to receive the media reports automatically via email or SMS, or download the reports to share them with your colleagues. It also works very well with Custom Alerts, and you can view the media reports in the Piwik Mobile app for Android and iOS. Via the HTTP Reporting API you can fetch any report in various formats. The plugin is so nicely integrated into Piwik that we would need several more blog posts to fully cover all the ways Media Analytics advances your Piwik experience and how you can use and dig into all the data to increase your conversions and sales.
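
    For example, fetching the “Video Titles” report over the HTTP Reporting API could look like the sketch below (the MediaAnalytics method and column names are assumptions to be checked against the plugin’s API reference; the URL and token are placeholders):

import requests  # third-party HTTP client (pip install requests)

PIWIK_URL = "https://your-piwik.example/index.php"  # placeholder
params = {
    "module": "API",
    "method": "MediaAnalytics.getVideoTitles",  # assumed method name
    "idSite": 1,
    "period": "week",
    "date": "today",
    "format": "JSON",
    "token_auth": "YOUR_TOKEN",                 # placeholder
}

response = requests.get(PIWIK_URL, params=params, timeout=30)
response.raise_for_status()
for row in response.json():
    # Column names are illustrative; inspect the JSON to see the exact keys.
    print(row.get("label"), row.get("nb_plays"), row.get("nb_impressions"))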

    How to get Media Analytics and related features

    You can get Media Analytics on the Piwik Marketplace. If you want to learn more about this feature, you might also be interested in the Media Analytics User Guide and the Media Analytics FAQ.

  • AWS lambda mp4 thumbnail generator using ffmpeg - incorrect format generated

    10 April 2021, by sam bhandu

    I am trying to create a thumbnail generator for every mp4 file uploaded to the S3 bucket. I have been following this post published by AWS. The code works fine for transcoding the video file. I changed the code to generate a thumbnail instead, and it does generate a file, but the file is an invalid image.

    


import json
import os
import subprocess
import shlex
import boto3
import uuid

S3_DESTINATION_BUCKET = "example-bucket"
SIGNED_URL_TIMEOUT = 60

def lambda_handler(event, context):

    # s3_source_bucket = event['Records'][0]['s3']['bucket']['name']
    # s3_source_key = event['Records'][0]['s3']['object']['key']
    # s3_source_basename = os.path.splitext(os.path.basename(s3_source_key))[0]
    # s3_destination_filename = s3_source_basename + "_cfr.ts"
    
    hex_c = uuid.uuid4()
    s3_destination_filename = '/{}/{}.{}'.format('tmp',hex_c, 'jpg')
    s3_client = boto3.client('s3')
    s3_media_url = 'https://s3-us-west-2.amazonaws.com/example-bucket/videos/presentations/testing.mp4'
    ffmpeg_cmd = "/opt/bin/ffmpeg -i \"" + s3_media_url + "\" -ss 00:00:02 -vframes 1  \"" + s3_destination_filename + "\""
    # ffmpeg_cmd = "/opt/bin/ffmpeg -i \"" + s3_source_signed_url + "\" -f mpegts -c:v copy -af aresample=async=1:first_pts=0 -"
    
    command1 = shlex.split(ffmpeg_cmd)
    p1 = subprocess.run(command1, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
   
    resp = s3_client.put_object(Body=s3_destination_filename, Bucket=S3_DESTINATION_BUCKET, Key='{}{}'.format(hex_c, '.jpg'))
    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete successfully')
    }


    


    The output is:

    


    {
  "statusCode": 200,
  "body": "\"Processing complete successfully\""
}

Function Logs
START RequestId: b73aaacc-5da5-417a-9f98-5def438dee96 Version: $LATEST
ffmpeg version 4.1.3-static https://johnvansickle.com/ffmpeg/  Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
  configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzvbi --enable-libzimg
  libavutil      56. 22.100 / 56. 22.100
  libavcodec     58. 35.100 / 58. 35.100
  libavformat    58. 20.100 / 58. 20.100
  libavdevice    58.  5.100 / 58.  5.100
  libavfilter     7. 40.101 /  7. 40.101
  libswscale      5.  3.100 /  5.  3.100
  libswresample   3.  3.100 /  3.  3.100
  libpostproc    55.  3.100 / 55.  3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'https://s3-us-west-2.amazonaws.com/example-bucket/videos/presentations/testing.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 1
    compatible_brands: isomavc1mp42
    creation_time   : 2020-04-17T18:31:33.000000Z
  Duration: 00:00:33.07, start: 0.000000, bitrate: 90 kb/s
    Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, bt709), 854x480 [SAR 1:1 DAR 427:240], 23 kb/s, 30 fps, 30 tbr, 30 tbn, 60 tbc (default)
    Metadata:
      creation_time   : 2020-04-17T18:31:29.000000Z
    Stream #0:1(eng): Audio: aac (HE-AAC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 64 kb/s (default)
    Metadata:
      creation_time   : 2020-04-17T18:31:29.000000Z
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[swscaler @ 0x67ddc40] deprecated pixel format used, make sure you did set range correctly
Output #0, image2, to '/tmp/4633bb13-4a15-49b7-a445-d910bebaddf6.jpg':
  Metadata:
    major_brand     : isom
    minor_version   : 1
    compatible_brands: isomavc1mp42
    encoder         : Lavf58.20.100
    Stream #0:0(und): Video: mjpeg, yuvj420p(pc), 854x480 [SAR 1:1 DAR 427:240], q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc (default)
    Metadata:
      creation_time   : 2020-04-17T18:31:29.000000Z
      encoder         : Lavc58.35.100 mjpeg
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame=    0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x    
frame=    0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x    
frame=    0 fps=0.0 q=0.0 size=N/A time=00:00:00.00 bitrate=N/A speed=   0x    
frame=    1 fps=0.4 q=6.3 Lsize=N/A time=00:00:00.03 bitrate=N/A speed=0.0149x    
video:14kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
END RequestId: b73aaacc-5da5-417a-9f98-5def438dee96
REPORT RequestId: b73aaacc-5da5-417a-9f98-5def438dee96  Duration: 6349.25 ms    Billed Duration: 6350 ms    Memory Size: 155 MB Max Memory Used: 123 MB Init Duration: 368.12 ms

Request ID
b73aaacc-5da5-417a-9f98-5def438dee96


    


    An image file is uploaded to the S3 folder, but when I try to open it, it is flagged as having an invalid file format. The file size is only 40.0 bytes.
(Screenshots: S3 bucket image folder; invalid file format error)
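
    A likely cause, judging from the code above, is that put_object receives the file name string (which happens to be about 40 bytes long) rather than the contents of the generated file. A minimal sketch of the corrected upload, assuming the ffmpeg command really wrote the frame to /tmp, would be:

# Inside lambda_handler, replacing the put_object call above (sketch only).
# It also helps to check p1.returncode and p1.stderr before uploading, so a
# failed ffmpeg run does not silently produce a broken object.
with open(s3_destination_filename, "rb") as thumbnail:
    s3_client.put_object(
        Body=thumbnail,                      # the file's bytes, not its path
        Bucket=S3_DESTINATION_BUCKET,
        Key='{}{}'.format(hex_c, '.jpg'),
        ContentType='image/jpeg',            # lets browsers display it as an image
    )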

    


  • AWS Lambda : ffmpeg thumbnails Generator : empty JPG

    3 September 2020, by Magikey

    When a video is uploaded to S3, I want to store a JPG screenshot of it.

    



    In a Lambda function on AWS, I do:

    



    ...

  let tmpFile = createWriteStream(`/tmp/screenshot.jpg`)

  var ffmpeg = spawn(ffmpegPath, [
      "-ss","00:00:05",
      "-i", target,
      "-vf", "thumbnail,scale=200:200", 
      "-qscale:v" ,"2",
      "-frames:v", "1",
      "-f", "image2",
      "-c:v", "mjpeg",
      "pipe:1"
    ]);

  ffmpeg.stdout.pipe(tmpFile).on("error", err => {
      console.log("Error A: ",err);
    });

  ffmpeg.on('error', err => {
    console.log("Error B", err)
    reject()
  })

  ffmpeg.on('close', code => {
    tmpFile.end();
    console.log('Log A', ffmpeg);

    child_process.exec("echo `ls -l -R /tmp`",
      (error, stdout, stderr) => {
        console.log(stdout)
    });

    resolve()
  })
...


    



    But the result is an empty JPG file in S3.

    



    The logs show no errors, my "target" is OK, and the ls on stdout shows me the empty JPG file.

    



    I have tried a lot of things, like using other versions of ffmpeg, but the result is the same.

    



    Here is the output of console.log('Log A', ffmpeg):

    



    ChildProcess {
 _events: [Object: null prototype] { error: [Function], close: [Function] },
 _eventsCount: 2,
 _maxListeners: undefined,
 _closesNeeded: 3,
 _closesGot: 3,
 connected: false,
 signalCode: 'SIGSEGV',
 exitCode: null,
 killed: false,
 spawnfile: '/opt/nodejs/ffmpeg',
 _handle: null,
 spawnargs: [
   '/opt/nodejs/ffmpeg',
   '-ss',
   '00:00:05',
   '-i',
   'https://xxxxxxxxx',
   '-vf',
   'thumbnail,scale=200:200',
   '-qscale:v',
   '2',
   '-frames:v',
   '1',
   '-f',
   'image2',
   '-v',
   '16',
   '-c:v',
   'mjpeg',
   'pipe:1'
 ],
 pid: 24,
 stdin: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: null,
     ended: false,
     endEmitted: false,
     reading: false,
     sync: true,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: true,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] { end: [Function: onReadableStreamEnd] },
   _eventsCount: 1,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   [Symbol(asyncId)]: 5,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stdout: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: false,
     ended: true,
     endEmitted: true,
     reading: false,
     sync: false,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: false,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] {
     end: [Function: onReadableStreamEnd],
     close: [Function]
   },
   _eventsCount: 2,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   write: [Function: writeAfterFIN],
   [Symbol(asyncId)]: 6,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stderr: Socket {
   connecting: false,
   _hadError: false,
   _parent: null,
   _host: null,
   _readableState: ReadableState {
     objectMode: false,
     highWaterMark: 16384,
     buffer: BufferList { head: null, tail: null, length: 0 },
     length: 0,
     pipes: null,
     pipesCount: 0,
     flowing: null,
     ended: true,
     endEmitted: true,
     reading: false,
     sync: false,
     needReadable: false,
     emittedReadable: false,
     readableListening: false,
     resumeScheduled: false,
     paused: true,
     emitClose: false,
     autoDestroy: false,
     destroyed: true,
     defaultEncoding: 'utf8',
     awaitDrain: 0,
     readingMore: false,
     decoder: null,
     encoding: null
   },
   readable: false,
   _events: [Object: null prototype] {
     end: [Function: onReadableStreamEnd],
     close: [Function]
   },
   _eventsCount: 2,
   _maxListeners: undefined,
   _writableState: WritableState {
     objectMode: false,
     highWaterMark: 16384,
     finalCalled: false,
     needDrain: false,
     ending: false,
     ended: false,
     finished: false,
     destroyed: true,
     decodeStrings: false,
     defaultEncoding: 'utf8',
     length: 0,
     writing: false,
     corked: 0,
     sync: true,
     bufferProcessing: false,
     onwrite: [Function: bound onwrite],
     writecb: null,
     writelen: 0,
     bufferedRequest: null,
     lastBufferedRequest: null,
     pendingcb: 0,
     prefinished: false,
     errorEmitted: false,
     emitClose: false,
     autoDestroy: false,
     bufferedRequestCount: 0,
     corkedRequestsFree: [Object]
   },
   writable: false,
   allowHalfOpen: false,
   _sockname: null,
   _pendingData: null,
   _pendingEncoding: '',
   server: null,
   _server: null,
   write: [Function: writeAfterFIN],
   [Symbol(asyncId)]: 7,
   [Symbol(kHandle)]: null,
   [Symbol(lastWriteQueueSize)]: 0,
   [Symbol(timeout)]: null,
   [Symbol(kBuffer)]: null,
   [Symbol(kBufferCb)]: null,
   [Symbol(kBufferGen)]: null,
   [Symbol(kBytesRead)]: 0,
   [Symbol(kBytesWritten)]: 0
 },
 stdio: [
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 1,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     [Symbol(asyncId)]: 5,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   },
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 2,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     write: [Function: writeAfterFIN],
     [Symbol(asyncId)]: 6,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   },
   Socket {
     connecting: false,
     _hadError: false,
     _parent: null,
     _host: null,
     _readableState: [ReadableState],
     readable: false,
     _events: [Object: null prototype],
     _eventsCount: 2,
     _maxListeners: undefined,
     _writableState: [WritableState],
     writable: false,
     allowHalfOpen: false,
     _sockname: null,
     _pendingData: null,
     _pendingEncoding: '',
     server: null,
     _server: null,
     write: [Function: writeAfterFIN],
     [Symbol(asyncId)]: 7,
     [Symbol(kHandle)]: null,
     [Symbol(lastWriteQueueSize)]: 0,
     [Symbol(timeout)]: null,
     [Symbol(kBuffer)]: null,
     [Symbol(kBufferCb)]: null,
     [Symbol(kBufferGen)]: null,
     [Symbol(kBytesRead)]: 0,
     [Symbol(kBytesWritten)]: 0
   }
 ]
}
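
    The signalCode: 'SIGSEGV' in the dump above indicates that the ffmpeg process crashed before writing a frame to stdout, which would explain the empty JPG. One way to narrow this down (sketched here in Python, in the style of the previous question's handler rather than Node.js, purely as an illustration) is to write the frame to a file under /tmp and inspect ffmpeg's exit status and stderr:

import shlex
import subprocess
import uuid

FFMPEG = "/opt/bin/ffmpeg"                  # assumed layer path, as in the question above
target = "https://example.com/video.mp4"    # placeholder input URL
out_path = "/tmp/{}.jpg".format(uuid.uuid4())

cmd = '{} -ss 00:00:05 -i "{}" -vf thumbnail,scale=200:200 -frames:v 1 "{}"'.format(
    FFMPEG, target, out_path)
proc = subprocess.run(shlex.split(cmd), capture_output=True, text=True)

# A negative return code means ffmpeg was killed by a signal (-11 is SIGSEGV);
# stderr then usually shows how far it got before crashing.
print("return code:", proc.returncode)
print(proc.stderr[-2000:])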