FFMPEG

FFMPEG example scripts for audio visualization

Original article: https://lukaprincic.si/development-log/ffmpeg-audio-visualization-tricks

Often one wants to share audio online, but video as a format has many more options: Mastodon, Twitter, Facebook and YouTube all allow uploading video, but not audio on its own. Here are some ffmpeg tricks for adding interesting, often auto-generated, visuals to an audio file. The longer commands below are split over several lines with trailing backslashes, so they can be pasted as shown or joined into a single line.

Audio Vector Scope

ffmpeg -i INPUT_AUDIO.wav -filter_complex \
"[0:a]avectorscope=s=480x480:zoom=1.5:rc=0:gc=200:bc=0:rf=0:gf=40:bf=0,format=yuv420p[v]; [v]pad=854:480:187:0[out]" \
-map "[out]" -map 0:a \
-b:v 700k -b:a 360k \
OUTPUT_VIDEO.mp4

The code above creates an MP4 video file with the vectorscope nicely centered inside an 854×480 (480p) frame (the pad offset of 187 is (854-480)/2, which centers the 480-pixel-wide scope). If you need a 1:1 video, just leave out the pad part:

ffmpeg -i INPUT_AUDIO.wav -filter_complex \
"[0:a]avectorscope=s=480x480:zoom=1.5:rc=0:gc=200:bc=0:rf=0:gf=40:bf=0,format=yuv420p[v]" \
-map "[v]" -map 0:a \
-b:v 700k -b:a 360k \
OUTPUT_VIDEO.mp4

Documentation on the ‘avectorscope’ filter is here: https://ffmpeg.org/ffmpeg-filters.html#avectorscope. One can play with zoom and the other options to produce the desired form.
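
For example, a sketch of one such variation (mode=polar and draw=line are documented avectorscope options; the exact values here are only illustrative):

ffmpeg -i INPUT_AUDIO.wav -filter_complex \
"[0:a]avectorscope=s=480x480:mode=polar:draw=line:zoom=2,format=yuv420p[v]" \
-map "[v]" -map 0:a -b:v 700k -b:a 360k OUTPUT_VIDEO.mp4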

Show waves

ffmpeg -i INPUT.wav -filter_complex \
"[0:a]showwaves=mode=line:s=hd480:colors=White[v]" \
-map "[v]" -map 0:a -pix_fmt yuv420p \
-b:a 360k -ar 44100 OUTPUT.mp4

more options: http://www.ffmpeg.org/ffmpeg-filters.html#showwaves

Showspectrum

ffmpeg -i INPUT.wav -filter_complex \
"[0:a]showspectrum=s=854x480:mode=combined:slide=scroll:saturation=0.2:scale=log,format=yuv420p[v]" \
-map "[v]" -map 0:a -b:v 700k -b:a 360k OUTPUT.mp4

The code above creates an almost completely desaturated spectrum of the audio, sliding from right to left. Again, there are various options to tweak; see: https://ffmpeg.org/ffmpeg-filters.html#showspectrum-1
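
For instance, a sketch of a full-colour variant (color=rainbow is one of the built-in showspectrum palettes; leaving saturation at its default restores the colour):

ffmpeg -i INPUT.wav -filter_complex \
"[0:a]showspectrum=s=854x480:mode=combined:slide=scroll:color=rainbow:scale=log,format=yuv420p[v]" \
-map "[v]" -map 0:a -b:v 700k -b:a 360k OUTPUT.mp4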

Histogram

ffmpeg -i INPUT.wav -filter_complex \
"[0:a]ahistogram=s=hd480:slide=scroll:scale=log,format=yuv420p[v]" \
-map "[v]" -map 0:a \
-b:a 360k OUTPUT.mp4

more options: http://www.ffmpeg.org/ffmpeg-filters.html#ahistogram

Static spectrogram

Sometimes you want to just create a static image.

## create a spectrogram as a single frame

ffmpeg -i INPUT.wav -lavfi \
showspectrumpic=s=hd480:legend=0,format=yuv420p \
SPECTROGRAM.png

## add the png to the audio - you need to know the length of the audio
ffmpeg -loop 1 -i SPECTROGRAM.png -i INPUT.wav \
-s hd480 -t 00:01:00 -pix_fmt yuv420p \
-b:a 360k -ar 44100 OUTPUT.mp4

The above is done in two steps. More info here: http://www.ffmpeg.org/ffmpeg-filters.html#showspectrumpic
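
If you don't know the audio length in advance, two sketches of how to avoid the fixed -t (file names as in the commands above): let -shortest stop the output when the audio ends, or query the duration with ffprobe first.

## let the looping image follow the audio length instead of a fixed -t
ffmpeg -loop 1 -i SPECTROGRAM.png -i INPUT.wav \
-s hd480 -shortest -pix_fmt yuv420p \
-b:a 360k -ar 44100 OUTPUT.mp4

## or print the audio duration in seconds with ffprobe
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 INPUT.wav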

Text overlays

This example concatenates two videos, draws text in the bottom-right corner of the result, and sends the FLV output to "$YOUTUBE_URL/$KEY" (the $QUAL, $FPS, $VBR, $YOUTUBE_URL and $KEY shell variables have to be set beforehand):

ffmpeg -i video1.mp4 -i video2.mp4 \
  -filter_complex "[0:v:0] [0:a:0] [0:v:1] [0:a:1] concat=n=2:v=1:a=1 [v][a];
[v]drawtext=text='SOME TEXT':x=(w-text_w):y=(h-text_h):fontfile=OpenSans.ttf:fontsize=30:fontcolor=white[v]" \
  -map "[v]" -map "[a]" -deinterlace \
  -vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
  -acodec libmp3lame -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 512k \
  -f flv "$YOUTUBE_URL/$KEY"

# Print file metadata etc.

   ffmpeg -i path/to/file.ext
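
ffprobe (installed alongside ffmpeg) prints the same information without the "no output file" complaint and can produce machine-readable output; a minimal example:

   ffprobe -v error -show_format -show_streams path/to/file.ext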

# Convert all m4a files to mp3

   for f in *.m4a; do ffmpeg -i "$f" -acodec libmp3lame -vn -b:a 320k "${f%.m4a}.mp3"; done

# Convert video from .foo to .bar (-g sets the GOP size, for seekability)

ffmpeg -i input.foo -vcodec bar -acodec baz -b:v 21000k -b:a 320k -g 150 -threads 4 output.bar 

# Convert image sequence to video

     ffmpeg -r 18 -pattern_type glob -i '*.png' -b:v 21000k -s hd1080 -vcodec vp9 -an -pix_fmt yuv420p -deinterlace output.ext

# Combine video and audio into 1 file

     ffmpeg -i video.ext -i audio.ext -c:v copy -c:a copy output.ext

# Listen to 10 seconds of audio from a video file (-ss: start time, -t: duration in seconds, -autoexit: close ffplay as soon as the audio finishes)

ffmpeg -ss 00:34:24.85 -t 10 -i path/to/file.mp4 -f mp3 pipe:play | ffplay -i pipe:play -autoexit
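
For a quick local preview, a simpler sketch without the pipe (ffplay accepts -ss, -t, -nodisp and -autoexit directly):

ffplay -ss 00:34:24.85 -t 10 -nodisp -autoexit path/to/file.mp4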


# Common switches

```bash
-codecs              # list codecs
-c:v                 # video codec (-vcodec) - 'copy' to copy stream
-c:a                 # audio codec (-acodec)
-fs SIZE             # limit file size (bytes)
```

# Bitrate

```bash
-b:v 1M              # video bitrate (1M = 1Mbit/s)
-b:a 1M              # audio bitrate
```

# Video

```bash
-aspect RATIO        # aspect ratio (4:3, 16:9, or 1.25)
-r RATE              # frame rate per sec
-s WIDTHxHEIGHT      # frame size
-vn                  # no video
```

# Audio

```bash
-aq QUALITY          # audio quality (codec-specific)
-ar 44100            # audio sample rate (hz)
-ac 1                # audio channels (1=mono, 2=stereo)
-an                  # no audio
-vol N               # volume (256=normal)
```

## Example

### Ringtone conversion using ffmpeg

```bash
ffmpeg -i foo.mp3 -ac 1 -ab 128000 -f mp4 -acodec libfaac -y target.m4r
```
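
libfaac has been dropped from current FFmpeg builds; a sketch of the same conversion with the built-in aac encoder (untested) would be:

```bash
ffmpeg -i foo.mp3 -ac 1 -b:a 128k -acodec aac -f mp4 -y target.m4r
```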

### To web

# no audio

ffmpeg -i input.mov -vcodec h264   -an -strict -2 output.mp4
ffmpeg -i input.mov -vcodec libvpx -an output.webm

ffmpeg -i input.mov -vcodec h264 -acodec aac -strict -2 output.mp4
ffmpeg -i input.mov -vcodec libvpx -acodec libvorbis output.webm

```html
<video width="320" height="240" controls>
  <source src="movie.mp4" type='video/mp4'>
  <source src="movie.webm" type='video/webm'>
</video>
```

# ffmpeg: generate overlay complex filter for audio with background image

    ffmpeg -y -i input.mp3 -loop 1 -i background.png \
    -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" \
    -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output.mp

[https://www.youtube.com/watch?v=zKYzJ_bEJVo](https://www.youtube.com/watch?v=zKYzJ_bEJVo)

ffmpeg -y -i audio.mp3 -loop 1 -i image.jpg \
    -filter_complex "[0:a]showwaves=s=1280x175:colors=Yellow:mode=line,format=yuv420p[v];[1:v][v]overlay=0:200[outv]" \
    -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output9.mp4

[https://www.youtube.com/watch?v=1htjI7YSNZo](https://www.youtube.com/watch?v=1htjI7YSNZo)

ffmpeg -y -i audio.mp3 -loop 1 -i image.jpg \
    -filter_complex "[0:a]showwaves=s=1280x720:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay[outv]" \
    -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest output1.mp4
[https://www.youtube.com/watch?v=H5dCKuN9Ius](https://www.youtube.com/watch?v=H5dCKuN9Ius)

ffmpeg -y -i audio.mp3 -loop 1 -i image.jpg \
    -filter_complex "[0:a]showwaves=s=1280x175:colors=White:mode=p2p,format=yuv420p[v];[1:v][v]overlay=0:200[outv]" \
    -map "[outv]" -pix_fmt yuv420p \
    -map 0:a -c:v libx264 -c:a copy -shortest output12.mp4
