DEV Community

István Lantos

Posted on

What are your UNIX pipeline commands that saved you from lot of coding/time?

This question came to my mind the other day when I wanted to free up some space on my HDDs. When it came to deciding what to delete (feel the pain), I wanted to know which folders are the fattest without a lot of right-clicking. As a good Windows user, I installed Total Commander, because some random Google result told me to do so.

Then I realised: heck, I have an entire UNIX environment on my PC (MSYS2), so maybe there is an utterly simple one-liner to achieve this. And guess what, there is: :)

$ du -sh * | sort -h 

Life hack, place this in your .bashrc file:

# Within the current directory, list all folders and files
# and print their size, then sort from smallest to largest:
alias du='du -sh * | sort -h'
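One caveat with that life hack: aliasing over du itself hides the plain command. A minimal variant (using dus as a hypothetical name of my own choosing) keeps both available:

```shell
# Same pipeline under a separate name, so plain `du` still works:
alias dus='du -sh * | sort -h'

# And if you do shadow `du`, `command du` bypasses the alias:
command du -sh .
```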

What are your UNIX pipeline commands, where combining some programs' standard output creates a wickedly simple time saver?

Top comments (42)

Valentin Baca • Edited

The most common pipes I use are:

grep    # ALL HAIL. Should know these:
grep -F # fixed/literal mode, use to search exactly what you type, no regex
grep -i # case insensitive
grep -v # invert, shows only lines that do NOT match
grep -w # matches whole "words" only; grep -w "lag" won't match "flag"
sort    # usually combined with uniq
uniq    # usually combined with sort
wc -l   # count lines
head -n 10 # top 10! you won't believe number 7!
tail -f # tail with follow, useful for logs that are being updated, such as: tail -f | grep -i error
cut     # to only get certain columns, for example:
cut -d' ' -f12-15 # split on spaces, only show columns 12 through 15
# or use gawk when dealing with buffers
gawk '{print $12,$13,$14,$15}'
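To make the -w and -v behavior concrete, here is a quick run on made-up input:

```shell
# -w matches whole words only, so "flag" and "red flag" don't match:
printf 'flag\nlag\nred flag\n' | grep -w lag
# → lag

# -v inverts the match:
printf 'flag\nlag\n' | grep -v flag
# → lag
```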

Here's a real command I'm using in a demo in 5 minutes to show something that's hidden deep in our logs:

tail -f our.log | grep -w "WARNING" | grep --color=always -F "something=" | cut -d' ' -f12-15 

EDIT: Unfortunately that doesn't work because of file buffering. Here's an actual working command for a log that is being updated live, but it does effectively the same thing:

tail -f our.log| grep --line-buffered -w "WARNING" | grep --color=always -F --line-buffered "something=" | gawk '{print $12,$13,$14,$15}' 

Bonus: Here's a super evil one >:)

$ yes > /dev/null & # does nothing but waste CPU 
Tuomo Tanskanen

sort -u to optimize sort | uniq ;)

Pierre

Unless you need to count the unique results: blah | sort -n | uniq -c
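The counting variant on a small fabricated sample (plain sort shown here; -n only matters for numeric input):

```shell
# Count occurrences of each line, most frequent first:
printf 'apple\nbanana\napple\napple\nbanana\ncherry\n' \
  | sort | uniq -c | sort -rn
# → 3 apple, 2 banana, 1 cherry (counts left-padded by uniq -c)
```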

Greg Bodnar

Cheers for the tip about grep options. I do a lot of grepping, but I think those options will save me heaps of time in the future.

jorin • Edited

I think process substitution is incredibly helpful. It took me a few years to find out about it.
A super powerful command is xargs. Also make sure to have a look at the -I and -p flags, both really powerful.
I just used these two features as part of an article I wrote,
Automate Your Mac Setup and Keep It Up to Date
Compare the output of two commands, take the resulting list, and pass it in one line to another command:

comm -13 <(sort brew.txt) <(brew leaves | sort) | xargs brew rm 
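For context: comm compares two sorted inputs in three columns (lines only in the first, lines only in the second, common lines), and -13 suppresses columns 1 and 3. A toy run, which needs bash for the <(...) process substitution:

```shell
# Keep only lines present in the second input but not the first:
comm -13 <(printf 'a\nb\n') <(printf 'b\nc\n')
# → c
```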

And some aliases I like:

# Start a webserver, also accepts a port as optional argument
alias server='python3 -m http.server'
# Copy to clipboard on Mac or Linux
alias copy="$(which pbcopy &> /dev/null && echo pbcopy || echo 'xclip -sel clip')"
# Pipe my public key to my clipboard.
alias pubkey="more ~/.ssh/id_rsa.pub | copy && echo '=> Public key copied to pasteboard.'"

If you are curious which commands you use a lot, you can find out like this:
What Are Your Most Used Shell Commands?

Ben Halpern

Oooh I feel like this is a weakness of mine. Looking forward to the responses

Florian Rohrer

The Just Commit Everything command:

alias gitasdf='git add -A && git commit -m asdf && git push' 
Benjamin Connelly • Edited

Having a shortcut to add and commit is priceless. I use this function with a useful commit message and commit often.

$ lg "My useful description"

# Lazy Git
function lg() {
  git add .
  git commit -a -m "$1"
  git push
}
# End Lazy Git
Tasmo

This might be very dangerous if you don't git pull after your commit. So I suggest:

# Lazy Git
function lg() {
  git add .
  git commit -a -m "$1"
  git pull
  git push
}
# End Lazy Git
Benjamin Connelly

Thank you! I have updated my function.

István Lantos

I have something similar, but I insert a timestamp as the commit message. Still totally useless. :)

I also replaced the word origin when creating the git remote with git remote add gitlab .... So if I push something to GitLab, I use the gitlab alias, which runs git push -u gitlab --all. I have something similar for GitHub.

Alex Miasoiedov

Why do you use git? Just use dropbox

Luciano Quercia • Edited
alias ctrlc='xclip -selection clipboard -i'
alias ctrlv='xclip -selection clipboard -o'

Examples:

# copy something without opening editors or showing it in terminal
cat ~/.ssh/mykey.pub | ctrlc
# paste clipboard somewhere
ctrlv >> ~/.ssh/known_hosts
ctrlv | tr '[:upper:]' '[:lower:]'
ctrlv | tr -d "\n"
ctrlv | base64 -d
ctrlv | base64
ctrlv | md5sum
cat myfile | base64 | ctrlc
cat myfile | md5sum | ctrlc
ctrlv | base64 | ctrlc
ctrlv | base64 -d | ctrlc
David Muckle • Edited

xargs is amazing for just about anything that doesn't like piped results. Want to pull a bunch of Docker images in one command?

echo "ubuntu gcc alpine nginx" | xargs -n 1 docker pull 
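The same -n 1 splitting can be seen with a harmless echo standing in for docker pull:

```shell
# xargs -n 1 invokes the command once per argument:
echo "ubuntu gcc alpine nginx" | xargs -n 1 echo pulling
# → pulling ubuntu
#   pulling gcc
#   pulling alpine
#   pulling nginx
```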
Bert Van Vreckem

So if I understand correctly, this is equivalent to the following?

for img in ubuntu gcc alpine nginx; do docker pull "${img}"; done 

I never thought of using xargs this way, good to know!

I guess you could parallelize the loop by adding -P 0 to the xargs invocation.
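A sketch of that parallel form with echo in place of docker pull (-P 0 means "as many processes as possible" in GNU xargs; output order is then not guaranteed):

```shell
# Four parallel workers, one argument each:
echo "ubuntu gcc alpine nginx" | xargs -n 1 -P 4 echo pulling
```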

David Muckle • Edited

I didn't consider using parallelization, that's a great idea! Though, what happens with Docker if it pulls in two copies of the same image at once? If two images have the same dependency, will Docker deal with this parallel pull fine, or will it bork?

EDIT: Docker seems to handle this rather well; it just queues up the layer pulls in some arbitrary order. Still, I don't entirely trust it, but I can't see a reason for this not to work (and I can see a reason one might want it to work!).

István Lantos

This is cool! I never thought about that.

Erik

Some dumb and possibly dangerous variations I use:

# A blunt tool to fix changed ruby files before a commit
# (note the escaped \$2 so awk, not the shell, expands it)
alias precommitfix="git status -s | awk '{print \$2}' | grep '\.rb$' | xargs bundle exec rubocop -a"

# Delete merged remote branches (remove --dry-run after verifying)
git branch -r --merged | \
  grep origin | \
  grep -v '>' | \
  grep -v master | \
  xargs -L1 | \
  awk '{sub(/origin\//,"");print}' | \
  xargs git push origin --delete --dry-run
Jonathan Bardin

I often use this one to get the PID of the parents of zombie processes.

ps -A -ostat,ppid | awk '/[zZ]/{print $2}' 
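The awk part just prints field 2 (the PPID) for rows whose STAT column contains z or Z; here is the same filter over fabricated ps-style output:

```shell
# Fake "STAT PPID" rows; only the zombie row's parent PID survives:
printf 'S 1\nZ 4242\nR 1\n' | awk '/[zZ]/{print $2}'
# → 4242
```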

Apart from that I often use grep, sort -u, sed.

A funny one is to pipe fortune into cowsay:

fortune -a | cowsay -f vader 
Jonas Brømsø

I have used perl in pipes a lot, for when you need to change something easily using a regular expression.

$ find . -name "*Relevant*.pm" | xargs perl -pi -e 's/something/somethingelse/'

The find -name part can be tweaked to suit your needs, and so can the regular expression in the Perl part.

SpeedMeUp • Edited

Top client IP in apache logs.

# cut -d" " -f1 apache.access.log | sort | uniq -c | sort -rn | head
 349963 22.208.1.241
  16434 15.99.2.62
   8685 7.8.27.98
   2047 52.14.4.76
    265 83.12.37.3
    149 3.71.24.250
     78 14.213.14.6
     13 182.37.3.88

This works for any field too: top URL, top browser, etc. Damn fast!
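For instance, switching the field number ranks a different column; in the common log format the request path is field 7 (the log lines below are fabricated):

```shell
# Top request paths instead of top client IPs:
printf '%s\n' \
  '1.2.3.4 - - [10/Oct/2000:13:55:36 -0700] "GET /a HTTP/1.0" 200 1' \
  '5.6.7.8 - - [10/Oct/2000:13:55:36 -0700] "GET /a HTTP/1.0" 200 1' \
  '1.2.3.4 - - [10/Oct/2000:13:55:36 -0700] "GET /b HTTP/1.0" 200 1' \
  | cut -d" " -f7 | sort | uniq -c | sort -rn
# → 2 /a, then 1 /b
```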