Ditto, I also have a similar tool collection, plus a bunch more, and a one-liner install script so it's all set up whenever I encounter an empty new server I have to work with: kakulukia.github.io/dotfiles/
Nice list. Plenty to try there! What are your thoughts on pipenv? I'm fairly new to Python, and coming from Ruby, pipenv seems very natural to me, but it doesn't seem to be widely adopted.
I haven't actually used pipenv. If you had asked me 2 months ago, I would have said: "don't use it, it's abandoned", as there was no release between 2018 and 2020-04-29. But it seems to be back on track again.

With my clients, I use requirements.txt + pip-tools to pin versions. That's an old-fashioned but pretty bulletproof solution. In my personal projects, I have used Poetry, and I liked it. But I would not use it in a commercial project (unless the client insists, which has never happened), mainly because of the bus factor (it's still maintained by a small group of people).

Also, keep in mind that I write applications, not Python packages. For writing a Python package, I would go with Poetry (it takes away a lot of the hassle with packaging, etc.)
Ah, OK. I saw a lot of places where people said it was the "right/standard" way of doing things going forward. I guess that could have been wishful thinking. My main use case is writing AWS Lambda functions, and I like the way I can define dev dependencies separately so they don't get packaged up with the final build and impact startup time. I've never been able to find a way to do that in a requirements.txt file, though. Is there an idiomatic way to deal with this? Maybe a separate requirements file?
Your intuition is correct - it can be achieved through separate requirements files. I usually have 4 of them (2 maintained by hand, 2 generated automatically):
requirements.in - list of packages that you want to include in the final build, e.g. "django". You can be even more specific: "django<3.0"
requirements.txt - pinned versions - this will be generated by pip-tools, so you never touch this file by hand, e.g. "django==2.2.2"
requirements-dev.in - packages used by developers (not included in the final build), e.g. "pytest"
requirements-dev.txt - pinned versions of packages for developers generated by pip-tools.
You add packages to the *.in files, generate the *.txt files with, let's say, a Makefile command, and use them to install packages on the destination servers.
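For completeness, such a Makefile could look roughly like this - a sketch, assuming pip-tools is installed; the target names are my own invention:

```make
# Hand-maintained inputs: requirements.in, requirements-dev.in
# Generated outputs (never edited by hand): requirements.txt, requirements-dev.txt
.PHONY: compile install

compile:
	pip-compile requirements.in
	pip-compile requirements-dev.in

install:
	pip install -r requirements.txt -r requirements-dev.txt
```

On the destination server you would run only pip install -r requirements.txt, so the dev packages never reach the final build.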
Thanks for the recommendations. I used thefuck in the past, but when I was switching to fish shell, I realized that I never got used to using it. The most common aliases in my terminal are usually 1-3 letters long ("gco" for "git commit"), I use "z" to move around the filesystem, and fish also adds some autocompletion, so there is little room for mistakes anymore.
xxh looks amazing! That's exactly what I needed so many times in the past! I don't ssh to servers that much anymore (damn you kubernetes taking our jobs!), but I will definitely give it a try!
As a note/warning: fancy prompts are nice, but if you're working on a distributed filesystem like Lustre, GPFS, BeeGFS, NFS, MSDFS, or Panasas, the large volume of silent file operations the prompt engines do behind the scenes combined with the dramatically increased latency of basic filesystem calls like stat() can make them reaaaaaaally slow.
I tried some fancy Powerline stuff on one of our compute clusters once, and it was just too frustrating -- I had to remove it after less than half an hour.
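To make that concrete, here is a tiny shell sketch of the kind of stat() burst a prompt engine can trigger on every redraw (the file names and counts are made up):

```shell
# Create 200 throwaway files, then stat them all in one burst -
# roughly what a git-aware prompt does while scanning a worktree.
demo=$(mktemp -d)
for i in $(seq 1 200); do : > "$demo/file$i"; done

time stat "$demo"/file* > /dev/null

rm -rf "$demo"
```

On a local disk the burst finishes in milliseconds; on NFS or Lustre each stat() can cost a network round-trip, so hundreds of them per redraw make the prompt visibly laggy.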
I believe you can run Python scripts from a virtual environment without activating the environment by calling the script by its full path. If you have a script named mygreatscript in a virtual environment in ~/Development/env/ you can run it with ~/Development/env/bin/mygreatscript
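A quick sketch of that, using a throwaway path rather than the one from the comment above:

```shell
# Create a venv and use its interpreter by full path - no "activate" needed.
python3 -m venv /tmp/demo-env
/tmp/demo-env/bin/python -c 'import sys; print(sys.prefix)'
# Prints the venv's own prefix, not the system one: the interpreter's
# location alone selects the right site-packages. Activation mostly just
# prepends the venv's bin/ directory to PATH for convenience.
rm -rf /tmp/demo-env
```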
This is such an awesome post, thanks so much. I haven't spent any time optimising my terminal experience in years - I will definitely check some of these tools out.
Great things to check :)
More for terminal lovers (Mac and Linux):
The zsh theme I love most is Powerlevel10k: github.com/romkatv/powerlevel10k
And Oh My Zsh has a lot of interesting plugins as well: ohmyz.sh/
Oh My Zsh is truly amazing if you are on Z shell. Highly recommend!
Awesome! Thanks for your help. I'll definitely try this as I feel like I'm swimming against the tide with pipenv.
This was very useful! I learned about several new tools from your post. Thank you!
Very nice! It's not often I see pgcli!
For those interested, I wrote an article a while back about mycli, which is a very, very similar tool for MySQL databases.
The shell is a very powerful tool! I could not live without it.
The best, though, is to learn a bit of bash and write scripts to automate everything boring that we do a bit too often.
Let me add my own project I use quite often, DevDash, to monitor my other side projects / landing pages / github pages from the terminal.
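As an illustration of that kind of automation, here is a small script in the spirit of "boring task, done too often" - the log-archiving use case and the paths are entirely made up:

```shell
#!/bin/sh
# Move week-old *.log files from a directory into an archive/ subfolder.
set -eu
LOG_DIR="${1:-$HOME/logs}"
mkdir -p "$LOG_DIR/archive"
find "$LOG_DIR" -maxdepth 1 -name '*.log' -mtime +7 \
    -exec mv {} "$LOG_DIR/archive/" \;
echo "archived old logs from $LOG_DIR"
```

Drop something like this in ~/bin, chmod +x it, and the chore becomes a single command.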
If I were still using MySQL, I would definitely use mycli - those are all awesome SQL shells.

Great job on DevDash! It looks amazing!
Thanks! delta and broot look great, I will definitely check them out!
I was expecting to just share what I'm using, but instead I'm getting a whole new collection of tools to test 😁 This is awesome!
This is great, thanks for sharing. I hadn't heard of exa before, very useful to be able to see git status like this :)
I can also recommend fasd (github.com/clvv/fasd) as a z alternative. It's like z but offers some additional features.
I remember using fasd before z, but I don't remember why I switched. Both are equally good, though!

Thank you for sharing, very informative.
Good read. I tried most of them after reading your article. Thank you for sharing :)
OMG this is epic
I just spent hours prettifying the Windows Terminal, only to be struck by this post.
It's an awesome post, Sebastian.
Thanks for sharing, there's so much gold in here.
Thank you for the kind words!
What a nice list. I surely got some of these plugins into my shell. Thanks!
I am a big fan of zsh and all the stuff that adds more fun and features to the shell, built around some big concepts.
Thanks for the post!
Awesome post! This is pure gold mate! ;)
mas is poor - it did not detect outdated apps that I could detect with CleanMyMac X.
Add gtop to this list: npmjs.com/package/gtop
Looks legit...