this post was submitted on 28 Jun 2023
32 points (97.1% liked)

Linux


As a person raised by GUIs, an extra visual confirmation and an extra prompt are a nice touch. I also like it when the system says "Oh, is that a directory? No problem, I'll give you the usual treatment." You know what I mean?

alias ls='ls --group-directories-first --color=auto -w 120'
alias ll='exa --group-directories-first -l'
alias la='ll -a'
alias lt='ll --tree'

alias cp='cp --recursive --interactive --verbose --reflink=always'
alias mv='mv --interactive --verbose'

# custom pwd
# - replace $HOME with ~
# - make everything before the last '/' green, and everything after white and bold
# - alias to p
alias pwd="pwd | sed 's:$HOME:~:' | sed -E 's:(.*/)([^/]+):\x1b[32m\1\x1b[0m\x1b[1m\2\x1b[0m:'"
alias p="pwd"

# custom cd.
# - prints the new directory after cd'ing.
cd () { 
    command cd "$@" && p;
}
alias c="cd"
alias '..'='c ..'
alias '...'='c ../..'

# For the '~' alias, we want to use the original cd because printing '~'
# again would be redundant.
alias '~'='command cd'

# custom rm.
# adds a '-r' flag only if there is a single argument and that argument
# is a directory.
# This is because I want the behavior of -I (interactive) to be the default,
# but I also want to have the -r flag available when I need it without being
# prompted for single files.
function rm () { 
  if [ $# -eq 1 ] && [ -d "$1" ]; then
    command rm --verbose --interactive=once --recursive "$1";
  else
    command rm --verbose --interactive=once "$@";
  fi;
}

# mkdir + cd (created as a function because they run on the current shell,
# which is what we want for cd)
mc () { 
  mkdir -p -- "$1" && cd -P -- "$1";
}
top 16 comments
[–] [email protected] 12 points 1 year ago (4 children)

The problem I have with this kind of thing is: I work on hundreds of different VMs and containers, and they can't all be set up like this, and neither can root and system accounts. So you get too used to it in one place and forget it's not there when trying to troubleshoot. These days I tend to try to keep my shell simple so my skills transfer easily anywhere.

[–] [email protected] 3 points 1 year ago

Same here; I don't even have ll in my vocabulary, although it seems to be a default on Debian-based systems.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

You can source a script for this directly from GitHub using curl and process substitution, in order to temporarily have the config when and where you are, without making it the default.

I do the same with vim.

Edit: here's the command:

source <(curl -s https://www.raw.githubusercontent.com/sorrybookbroke/bashingmyheadin/master/bashrc)

[–] [email protected] 2 points 1 year ago

Right? I wonder why this approach isn't more common.

How do you do this with vim, btw? I've looked into it before but haven't found a fully satisfying answer yet.

[–] [email protected] 2 points 1 year ago

That's exactly the thing. I limit my configuration to basic environment variables and define sudoers using LDAP (sss). This way I can have some preferred defaults for some tools, but I don't configure many aliases.

If I really need it, I package it (deb, rpm, ...) and deploy it properly, either as a profile file or as a script/program.

Using a big well configured bashrc/zshrc/... is more trouble than it's worth for administrators, because it doesn't transfer between environments easily and increases the mental load by a lot. Even though the idea itself is good.
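
Something like this is what I mean by a profile file; the path, file name, and the exact defaults here are only illustrative:

# /etc/profile.d/site-defaults.sh  (illustrative name)
# conservative, non-surprising defaults only -- nothing that changes
# how cp/mv/rm behave
export EDITOR=vim
export HISTSIZE=10000
export LESS='-R'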

[–] [email protected] 1 points 1 year ago

If you're allowed Docker on your systems, build a sysadmin container with all your favorite tools. Then just run it locally and remotely with the root directory bound to /mnt or something.
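
A rough sketch of what that could look like; the base image, tool list, and names are just placeholders:

# Dockerfile for a personal sysadmin toolbox (illustrative)
FROM debian:stable-slim
RUN apt-get update && apt-get install -y \
        vim rsync htop ripgrep \
    && rm -rf /var/lib/apt/lists/*
COPY bashrc /root/.bashrc

# run it with the host's root filesystem bound at /mnt, e.g.:
# docker build -t toolbox . && docker run -it --rm -v /:/mnt toolbox bash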

[–] [email protected] 11 points 1 year ago (1 children)

Only when needed. And I don't really use aliases at all for basic things like that. I don't like cp or mv being verbose most of the time, as I don't have much use for their output, and I like it when commands are quiet unless there is a problem, since that makes it easier to spot the error lines. I don't really want to be confirming every file in a recursive rm or cp or mv either; that just leads to hitting y automatically and not thinking about it. I do like how zsh warns you about rm *, though, with the directory you are about to remove.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

That's fair.

I don’t really want to be confirming every file in a recursive rm or cp or mv either

Ah, but rm will only make you confirm if there are more than 3 files to be removed, and cp and mv only if there's a risk of overwriting. And it's only one confirmation per command, not per file.

[–] [email protected] 5 points 1 year ago (1 children)

You must have a different version of rm than I do:

$ rm -i a
rm: remove regular empty file 'a'? y
$ rm -ir f
rm: descend into directory 'f'? y
rm: remove regular empty file 'f/a'? y
rm: remove regular empty file 'f/b'? y
rm: remove regular empty file 'f/c'? y
rm: remove regular empty file 'f/d'? y
rm: remove regular empty file 'f/e'? y
rm: remove directory 'f'? y

Never really had an issue with cp or mv overwriting files where I didn't mean to. Possibly because I don't default to recursive all the time, so they tend to error out when accidentally copying folders. And I tend to tab-complete paths to see what is in them. I also tend to use rsync rather than plain cp for copying large directories that might not be empty to start with.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

This should work:

$ rm -Ir f
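
With GNU coreutils that should prompt only once before descending, roughly like this (exact wording may vary between versions):

$ rm -Ir f
rm: remove 1 argument recursively? y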
[–] [email protected] 5 points 1 year ago (1 children)

Be careful, as this can easily break many scripts.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (2 children)

Shouldn't be a problem, because scripts are run non-interactively and my .bashrc wouldn't be read, right?
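
A quick way to convince yourself (output and paths may differ on your system):

$ type cp                # interactive shell: the alias is active
cp is aliased to `cp --recursive --interactive --verbose --reflink=always'
$ bash -c 'type cp'      # non-interactive: .bashrc isn't read, no alias
cp is /usr/bin/cp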

[–] [email protected] 1 points 1 year ago

That's actually good to know, thanks.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

See replies to this comment.

~~That's not necessarily the case. Most scripts will just run in the current environment (meaning your .bashrc will be used) and not define many/any arguments for commands like cp or rm.~~

~~Your .bashrc file is read whenever you start a command line shell, so by the time you can even run a script you probably already invoked your aliases.~~

~~Exceptions would be if you're running a script from cron or running the script from another shell like sh or zsh.~~

[–] [email protected] 3 points 1 year ago

I definitely type cp -i and mv -i by habit, and I have a ~/trash directory that I often move stuff into instead of deleting it and then periodically blow away completely. I agree with some other commenters that all the stuff in .bash_profile is maybe not a good idea, but I find that just doing those three things creates a lot more safety than just the "you said to remove it so now 5ms later it's gone forever" Unix default.
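
In case it's useful, a tiny helper along those lines (the names are just what I'd pick):

# move things into ~/trash instead of deleting them
trash () {
  mkdir -p ~/trash && mv --interactive --verbose -- "$@" ~/trash/
}
# and every so often: command rm -rf ~/trash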

[–] [email protected] 2 points 1 year ago

For deleting specifically I use gio trash; it should work on any GNOME install.
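
For example (the file name is just a placeholder):

gio trash some-file.txt    # send it to the trash
gio list trash://          # see what's in there
gio trash --empty          # empty the trash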
