/media/bill/HOWELL_BASE/System_maintenance/Linux/cat, join, paste notes.txt
www.BillHowell.ca 07Aug2017 initial

08********08
22Jul2021 catenate .mp4 files

22Jul2021 - simple cat does NOT work - the cat'd file has the full size, but the video stops at the end of the 1st segment
see "$d_bin""IJCNN mark paper title lines.sh" - mark lines with paper fname
Key issue - paths must be "spelled out" : use :
   '/media/bill/Dell2/PROJECTS/2021 IJCNN Shenzen, China/plenaries/PleMon Riitta Salmelin, What neuroimaging can tell about human brain function.mp4'
   NOT : "$d_conf"'PleMon Riitta Salmelin, What neuroimaging can tell about human brain function.mp4'
Don't forget the "file " at the start of each line of "$d_temp""cat_Plenaries file list.txt"
   (a list-building sketch appears below, after the ffmpeg FAQ excerpt)

+-----+
https://superuser.com/questions/521113/join-mp4-files-in-linux

The best way to do this currently is with the concat demuxer.
First, create a file called inputs.txt formatted like so:
   file '/path/to/input1.mp4'
   file '/path/to/input2.mp4'
   file '/path/to/input3.mp4'
Then, simply run this ffmpeg command:
   ffmpeg -f concat -i inputs.txt -c copy output.mp4
See also concatenation in the ffmpeg FAQ.
edited Feb 13 '20 at 11:24, answered Dec 20 '12 at 4:26, evilsoup
+--+
Here is a one-liner that makes it all that much easier. It does everything needed in order to finally provide you with one resulting video file. Cheers!
   find *.mp4 | sed 's:\ :\\\ :g' | sed 's/^/file /' > list.txt; ffmpeg -f concat -i list.txt -c copy output.mp4; rm list.txt
answered Apr 6 '19 at 21:58, Anonymous
+--+

http://www.ffmpeg.org/faq.html#Concatenating-using-the-concat-demuxer
3.14.3 Concatenating using the concat protocol (file level)
FFmpeg has a concat protocol designed specifically for that, with examples in the documentation.
A few multimedia containers (MPEG-1, MPEG-2 PS, DV) allow one to concatenate video by merely concatenating the files containing them.
Hence you may concatenate your multimedia files by first transcoding them to these privileged formats, then using the humble cat command (or the equally humble copy under Windows), and finally transcoding back to your format of choice.
   ffmpeg -i input1.avi -qscale:v 1 intermediate1.mpg
   ffmpeg -i input2.avi -qscale:v 1 intermediate2.mpg
   cat intermediate1.mpg intermediate2.mpg > intermediate_all.mpg
   ffmpeg -i intermediate_all.mpg -qscale:v 2 output.avi
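
+--+
List-building sketch (not from the thread or FAQ above) - a minimal shell sketch, assuming the directory, list, and output names below are placeholders.
printf repeats its format once per argument, so each .mp4 gets its own "file '/full/path'" line, and spaces or commas in the names stay safely inside the single quotes
(this assumes no single quotes appear in the filenames themselves). -safe 0 is needed because the list uses absolute paths :
   # sketch only - placeholder directory, list, and output names
   d_video='/media/bill/Dell2/PROJECTS/2021 IJCNN Shenzen, China/plenaries'
   f_list='/tmp/cat_list.txt'
   f_out='/tmp/plenaries_all.mp4'
   # one "file '/full/path'" line per segment; printf re-uses the format for each argument
   printf "file '%s'\n" "$d_video"/*.mp4 > "$f_list"
   # -safe 0 lets the concat demuxer accept absolute paths; -c copy avoids re-encoding
   ffmpeg -f concat -safe 0 -i "$f_list" -c copy "$f_out"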
08********08
14Jul2021 search "Linux cat and how do I skip the first line?"

https://stackoverflow.com/questions/604864/print-a-file-skipping-the-first-x-lines-in-bash
+--+
You'll need tail. Some examples:
   $ tail great-big-file.log
   < Last 10 lines of great-big-file.log >
If you really need to SKIP a particular number of "first" lines, use
   $ tail -n +<N+1> <filename>
   < filename, excluding first N lines. >
That is, if you want to skip N lines, you start printing at line N+1. Example:
   $ tail -n +11 /tmp/myfile
   < /tmp/myfile, starting at line 11, or skipping the first 10 lines. >
If you want to just see the last so many lines, omit the "+":
   $ tail -n <N> <filename>
   < last N lines of file. >
edited Dec 14 '18 at 23:27, rogerdpack; answered Mar 3 '09 at 2:24, SingleNegationElimination
Or "tail --lines=+<N+1> ..." for the readable-commands crowd :-) – paxdiablo Mar 3 '09 at 2:34
+--+
Easiest way I found to remove the first ten lines of a file:
   $ sed 1,10d file.txt
In the general case (where X is the number of initial lines to delete, credit to commenters and editors for this):
   $ sed 1,Xd file.txt
edited Dec 27 '20 at 18:03, answered Oct 17 '12 at 7:17, David Parks
+--+
Use the sed delete command with a range address. For example:
   sed 1,100d file.txt    # Print file.txt omitting lines 1-100.
Alternatively, if you want to only print a known range, use the print command with the -n flag:
   sed -n 201,300p file.txt    # Print lines 201-300 from file.txt
This solution should work reliably on all Unix systems, regardless of the presence of GNU utilities.
edited Apr 13 '20 at 2:31, Peter Mortensen; answered Dec 2 '16 at 16:19, maerics

********
20Sep2020 join command

looks like a great illustration of join :
https://www.howtogeek.com/542677/how-to-use-the-join-command-on-linux/
How to Use the join command on Linux
Dave McKay, @TheGurkha, February 19, 2020, 8:00am EDT
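
+--+
A minimal join example (a sketch, not taken from the article above); the two files are hypothetical, and both must be sorted on the join field (field 1 by default) :
   $ cat ages.txt
   alice 34
   bob 27
   $ cat cities.txt
   alice Ottawa
   bob Calgary
   $ join ages.txt cities.txt
   alice 34 Ottawa
   bob 27 Calgary
join matches lines on the shared first field, then prints that field followed by the remaining fields of file 1 and file 2.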
**********************
07Aug2017 15:22 pipe one command into another

https://stackoverflow.com/questions/864316/how-to-pipe-list-of-files-returned-by-find-command-to-cat-to-view-all-the-files
I am doing a find and then getting a list of files. How do I pipe it to another utility like cat (so that cat displays the contents of all those files), and basically I need to grep something from these files.
edited Oct 1 '13 at 15:12, Jonathan Leffler; asked May 14 '09 at 16:18, Devang Kamdar
+-----+
1. Piping to another process (although this WON'T accomplish what you said you are trying to do):
   command1 | command2
This will send the output of command1 as the input of command2.
2. -exec on a find (this will do what you are wanting to do -- but is specific to find):
   find . -name '*.foo' -exec cat {} \;
(Everything between find and -exec are the find predicates you were already using. {} will substitute the particular file you found into the command (cat {} in this case); the \; is to end the -exec command.)
3. Send the output of one process as command-line arguments to another process:
   command2 `command1`
for example:
   cat `find . -name '*.foo' -print`
(Note these are BACK-QUOTES, not regular quotes (under the tilde ~ on my keyboard).) This will send the output of command1 into command2 as command-line arguments. Note that file names containing spaces (newlines, etc) will be broken into separate arguments, though.
edited Nov 24 '11 at 19:33, Jonathan Leffler; answered May 14 '09 at 16:30, kenj0418
+-----+
cat `find -name '*.foo' -print` worked great for me ... Thanks – Devang Kamdar May 15 '09 at 13:24
The backquotes work great and are more generalized; you can use this to cat a list of files from a file as well. – Hazok Sep 2 '11 at 18:31
Note that modern versions of find allow you to write: find . -name '*.foo' -exec cat {} +, where the + indicates that find should group as many file names as convenient into a single command invocation. This is quite useful (it deals with spaces etc in file names without resorting to -print0 and xargs -0). – Jonathan Leffler Nov 24 '11 at 19:29
The -exec also works for grep (find . -name '*.foo' -exec grep bar {} \;) – Mike Pone Sep 26 '12 at 19:39
Unmentioned: find . -name '*.foo' | xargs cat – stewSquared Aug 24 '16 at 20:16
+-----+
There are a few ways to pass the list of files returned by the find command to the cat command, though technically not all use piping, and none actually pipe directly to cat.
The simplest is to use backticks:
   cat `find [whatever]`
Equivalently, you can use $() instead of backticks in some shells, including bash:
   cat $(find [whatever])
This is less portable, but is nestable. Both syntaxes take the output of find and put it on cat's command line. It doesn't work well if find has too much output (more than can fit on a command line) or if the output has special characters (like spaces).
You can use find's -exec action, which executes a command for each file it finds:
   find [whatever] -exec cat {} \;
This will run cat once for every single file, rather than running a single instance of cat and passing it multiple filenames, which can be inefficient and might not have the behavior you want for some commands (though it's fine for cat). The syntax is also a bit annoying. (You need to escape the semicolon because the semicolon is special to the shell!)
Some versions of find (most notably the GNU version) let you replace ; with + to use find's append mode and run fewer instances of cat:
   find [whatever] -exec cat {} +
This will pass multiple filenames to each invocation of cat, which can be more efficient. Note that it is not guaranteed to use a single invocation, however; if the command line would be too long then the arguments are spread across multiple invocations of cat. (On Linux systems, the command-line length limit is quite large, so this isn't typically an issue.)
The classic/portable approach is to use xargs:
   find [whatever] | xargs cat
xargs runs the command specified (cat, in this case) and adds arguments based on what it reads from stdin. Just like -exec with +, this will break up the command line if necessary; that is, if find produces too much output, it'll run cat multiple times. (Like the note about -exec earlier, there are some commands where this splitting may result in different behavior.) Note that using xargs like this has issues with spaces in filenames, as xargs just uses whitespace as a delimiter.
The most robust, portable, and efficient method also uses xargs:
   find [whatever] -print0 | xargs -0 cat
The -print0 flag tells find to use \0 (null character) delimiters between filenames, and the -0 flag tells xargs to expect these \0 delimiters. This has pretty much identical behavior to the -exec ... + approach, though it is more portable (and also a bit more verbose, obviously).
edited Jul 26 at 19:45, answered May 14 '09 at 16:29, Laurence Gonsalves
+-----+
To achieve this (using bash) I would do as follows:
   cat $(find . -name '*.foo')
This is known as "command substitution", and it strips line feeds by default, which is really convenient!
edited Sep 27 '12 at 7:55, answered Sep 12 '12 at 16:27, Stphane
+-----+
Sounds like a job for a shell script to me:
   for file in `find -name '*.xml'`
   do
      grep 'hello' $file
   done
or something like that.
answered May 14 '09 at 16:21, Gandalf
This is a valid, though not necessarily optimal, answer to the question. – Jonathan Leffler May 14 '09 at 20:16
...yeah but it is great if you want one big file with filenames listed as well. – ʍǝɥʇɐɯ Jun 11 '11 at 19:49
I like this the best. A loop block like this leaves room to do other things. – kakyo Aug 20 '14 at 21:38
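
+-----+
A hedged sketch (not from the thread above, assumes bash) that keeps the per-file loop of the last answer but is safe for spaces and newlines in filenames, by combining it with the -print0 idea from the longer answer above; the search path and pattern are placeholders :
   find . -name '*.xml' -print0 |
   while IFS= read -r -d '' file ; do
      grep 'hello' "$file" /dev/null    # the extra /dev/null makes grep prefix each match with its filename
   done

# enddoc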