The Advent of CLI - Day 3


It is about programming alright

Now that some basic stuff is out of the way, we can dive right into the guts of things: programming.

But when it comes to the CLI this is where it gets much more interesting, see
being able to program from the CLI means you can access most parts of your system,
and because you can program it, you can automate it too.

That’s where all that stuff gets its magic sauce: programming + CLI = super duper power.

I will try to demo that with some practical examples.

Programming with Bash

All the commands you input into the CLI line by line are repetitive
and can take some time to remember, but you can put all those lines
into a text file and make it execute those commands as if you were
typing them, that’s basically a shell script.

A shell script under a Unix-like system (Linux and macOS) follows 2 basic rules

  • it has to start with a shebang line pointing to the script interpreter
  • it has to have the executable bit set

Under Windows it’s different: under the CMD shell,
the file extension determines whether the file is considered executable.

So let’s create the classic hello world 🙂

Under the command line you would directly write
$ echo "hello world"


hello world

for a script you would create a file
$ touch

edit this file so it contains the following lines

#!/bin/bash

# define a variable
STRING="hello world"
# display the variable to the screen
echo $STRING

and to be able to execute it you make it executable
$ chmod +x

to run it in the current folder
$ ./


hello world

the ./ is necessary here
. points to the current directory
so ./file means execute the file in the current directory

Not very impressive I agree, but see it like that:
when you type the full line $ echo "hello world"
you are entering the characters one by one and it takes time

but when you enter $ ./
you basically need to enter the first few chars (3 usually),
use the [TAB] shortcut so it autocompletes, and done

it is at least far fewer characters to write

And if we look at the file we can learn a little bit

the first line, the shebang line, follows this format
#!interpreter [optional-arg]

for a shell you can usually use the absolute path as #!/bin/bash

for other script interpreters you could also use #!/usr/bin/php,
but you could use env to make the script “more portable”,
for ex: #!/usr/bin/env perl

that way if on macOS your perl path is /opt/local/bin/perl
and on Linux your perl path is /usr/bin/perl, it will work for both,
read Make Linux/Unix Script Portable With #!/usr/bin/env As a Shebang for more details.
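env itself just searches your $PATH for the named interpreter, so you can preview what a #!/usr/bin/env shebang would resolve to with command -v; a quick sketch:

```shell
# env locates the interpreter through $PATH,
# the same lookup a '#!/usr/bin/env bash' shebang triggers
resolved="$(command -v bash)"
echo "env would run bash from: $resolved"
```

on one machine that may print /bin/bash, on another /opt/local/bin/bash, which is exactly why the env form travels better than a hard-coded path.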

In the script you can also see comments
# define a variable and # display the variable to the screen

a little trick I learned over years of bash scripting is to comment
each single “action”, you can see that as a prototype of what the thing is supposed to do,
and later on, when you wonder “what is this thing doing”
in front of some cryptic, barely readable lines, it will help :p.

Finally, you can see how we declare a variable STRING="hello world"
and how we reuse it later echo $STRING
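as a minimal sketch of the rules around bash variables (no spaces around the =, $name or ${name} to read it back, quotes to keep spaces intact), with made-up names GREETING, NAME and MESSAGE:

```shell
# no spaces allowed around the = sign
GREETING="hello"
NAME="world"
# the ${ } braces delimit the variable name inside a longer string
MESSAGE="${GREETING} ${NAME}"
# quote "$var" when reading it, so values with spaces stay intact
echo "$MESSAGE"
```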


Practical Examples

So you’re a developer and often when you start a new project you have some habits,
you like things to be organised and/or named in a certain way, you can do it “by hand”
or you can automate it with Bash.

First, you need to create the script, but you also want to do it in such a way
that you can access it from anywhere.

In the command line, when you type commands, the shell uses the $PATH
environment variable to know where to look for those commands

for ex:
$ echo $PATH

will output something like:


/usr/local/bin:/usr/bin:/bin

it will look first in /usr/local/bin, then /usr/bin, then /bin, etc.
and if the command is not found you will get a message telling you it cannot find it

for ex:
$ foobar


-bash: foobar: command not found
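you can also ask the shell where it found (or did not find) a command with command -v; a sketch, foobar-no-such-command being a made-up name:

```shell
# 'command -v' prints the path the shell resolved for a command
located="$(command -v ls)"
echo "ls was found at: $located"

# a made-up command name resolves to nothing and the lookup fails
if ! command -v foobar-no-such-command >/dev/null 2>&1; then
    echo "foobar-no-such-command: not found anywhere in \$PATH"
fi
```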

So there, by default, you could place your script in /usr/local/bin.

But bash also uses different config files when the shell starts,
see Bash Startup Files

Invoked as an interactive login shell, or with --login

When Bash is invoked as an interactive login shell, or as a non-interactive shell with
the --login option, it first reads and executes commands from the file /etc/profile,
if that file exists. After reading that file, it looks for ~/.bash_profile, ~/.bash_login,
and ~/.profile, in that order, and reads and executes commands from the first one that exists
and is readable. The --noprofile option may be used when the shell is started to inhibit this behavior.

When an interactive login shell exits, or a non-interactive login shell executes the exit builtin command,
Bash reads and executes commands from the file ~/.bash_logout, if it exists.

so a file like /etc/profile is a system-wide profile
and a file like ~/.profile is your current user profile

~ indicates the user home directory and automatically expands to its path,
you can find this path by using the environment variable $HOME,
try $ echo ~, $ echo $HOME

For example, in my case, under Linux it would be /home/zwetan
and under macOS it would be /Users/zwetan.

and usually this profile loads other config files like /etc/bashrc
and also looks for ~/.bashrc, ~/.bash_profile, etc.

Yeah it is a bit confusing to find out what’s going on, read Shell initialization files to know all the gritty details.

That said, often in all those config files, loaded from different locations
and in particular orders, we find those lines (installed by the system)

# set PATH so it includes user's private bin if it exists
if [ -d "$HOME/bin" ] ; then
    PATH="$HOME/bin:$PATH"
fi

that can translate to

  • if the directory bin exists in the user home directory
  • prepend this directory to the PATH environment variable
  • if your PATH was /usr/local/bin:/usr/bin:/bin
    it updates to $HOME/bin:/usr/local/bin:/usr/bin:/bin

and this path $HOME/bin, in my case /home/zwetan/bin
is a pretty good candidate to store my own personal script files

so instead of placing your script into /usr/local/bin
you should place it into $HOME/bin, provided this private bin directory exists
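if $HOME/bin is not yet in your $PATH (eg. the directory was created after login), you can do the same prepend by hand for the current session; a sketch (put the same line in ~/.profile to make it permanent):

```shell
# prepend the private bin directory for the current shell session,
# mirroring what the profile snippet does at login
PATH="$HOME/bin:$PATH"
export PATH
echo "$PATH"
```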

So let’s do that

  • navigate to our home directory
    $ cd ~
  • create the bin directory if it does not exist
    $ mkdir bin
  • navigate to that bin directory
    $ cd bin (case where we are already in the home directory)
    $ cd ~/bin (from any other location use the absolute path)
  • create the script
    $ touch aa-project-create
  • make it executable
    $ chmod +x aa-project-create

Here I used my own naming convention, eg aa-project-create, here’s why

  • aa I use 2 letters to categorize my scripts
    it could be aa or anything else
  • then I use project for the “subject” of the script
    eg. “it is about projects”
  • and finally create for the main “action”
    eg. create a project

That’s how I do it for myself, you can use any other way that works for you.

Why I do it like that is mainly because I manage a hell of a lot of scripts,
and a naming scheme like something-otherthing-yetotherthing provides cheap, free syntax completion

Also here’s a little trick: in the same directory, eg. $HOME/bin,
I create a simple script aa-cmd with this content

find ~/bin -name "aa-*" -print0 | xargs -0 ls -1  | xargs -n 1 basename

which allows me, anywhere (eg. system-wide), to simply list all the scripts for a particular category
by doing a simple $ aa-cmd
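if you want to see what that pipeline does without touching your real $HOME/bin, you can point it at a throwaway directory; a sketch using mktemp (the file names here are made up):

```shell
# set up a throwaway directory with a few fake scripts
demo_dir="$(mktemp -d)"
touch "$demo_dir/aa-project-create" "$demo_dir/aa-cmd" "$demo_dir/zz-other"

# same pipeline as aa-cmd, pointed at the demo directory:
# find the matching files, list them one per line, keep only the base names
listing="$(find "$demo_dir" -name "aa-*" -print0 | xargs -0 ls -1 | xargs -n 1 basename | sort)"
echo "$listing"

# clean up the throwaway directory
rm -rf "$demo_dir"
```

only the two aa-* files show up, zz-other is filtered out by the -name pattern.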

Now, what to put into that aa-project-create script?

OK, my personal preferences when I create a project are pretty simple
I usually go with this structure of files and folders

{name of the project}
  |_ build
  |_ docs
  |_ src

so the script would look like that

#!/bin/bash

# create new project

current_dir() {
    echo "$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
}

# --- main ---

TARGET=$1

if [[ -z "${TARGET}" ]]; then
    echo "project target is missing"
    echo "eg. $ ./${0##*/} projectname"
    exit 1
fi

if [ -d "$TARGET" ]; then
    echo "project directory already exists"
    exit 1
fi

# create project directory
mkdir ${TARGET}

# create sub directories
mkdir ${TARGET}/build
mkdir ${TARGET}/docs
mkdir ${TARGET}/src

# create files
touch ${TARGET}/

# done
exit 0

OK, at first sight it is pretty cryptic, that’s bash syntax for you, it is horrible but it has got advantages too

As a scripting language bash is terse (uses very few words), that’s what makes it ugly’ish,
but it is at the same time why you want to use it, because it uses very few words.

You want things to be fast, no time to write beautiful code, no time to compile it, etc.

You just want the thing to work as fast as possible and move on to the next target,
so yeah this involves a certain amount of dirty and ugly.

We’re gonna review line by line what this script is doing,
but let’s focus on the last line first: exit 0.

Even if it is a script, it is an executable and so follows the rules of executables

  • an exit code of 0 means success
  • an exit code of 1 means failure
  • an exit code of 2 means command line usage error
  • any exit code bigger than 0 means some kind of error
    this exit code should be in the range 0-255
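the list above can be checked straight from a script; a small sketch using the built-in true and false commands:

```shell
# 'true' always succeeds: its exit code is 0
true
echo "true exited with $?"

# 'false' always fails: its exit code is 1
# (checked inside an if so the script keeps going)
if false; then
    echo "never reached"
else
    status=$?
    echo "false exited with $status"
fi
```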

see Exit and Exit Status
and Exit Codes With Special Meanings

simply put

The exit command terminates a script, just as in a C program.
It can also return a value, which is available to the script’s parent process.

Every command returns an exit status (sometimes referred to as a return status or exit code).
A successful command returns a 0, while an unsuccessful one returns a non-zero value
that usually can be interpreted as an error code. Well-behaved UNIX commands, programs,
and utilities return a 0 exit code upon successful completion, though there are some exceptions.

we want to be well-behaved, so to indicate everything went well we terminate our script with exit 0,
and we do that in case our little script gets chained with other commands.

Now let’s go back to the top of our script where we define a function

current_dir() {
    echo "$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
}

you can declare a function with either of these syntaxes

function function_name {
    # body
}

function_name() {
    # body
}
exit status also apply to functions

Likewise, functions within a script and the script itself return an exit status.

few rules about functions, otherwise look at Functions

  • a function can not be empty
  • a function can accept parameters
  • you can define local variables
  • to return a string use echo
  • to return an integer use return
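the rules above can be sketched with a hypothetical greet function, using echo as the “return value” and $( ) to capture it:

```shell
# a function taking one parameter ($1) and "returning" a string via echo
greet() {
    # 'local' limits the variable to the function body
    local who="$1"
    echo "hello ${who}"
}

# capture the echoed string with command substitution
result="$(greet "world")"
echo "$result"
```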

In your scripts, if you want to know the exit status of the last command executed, you can use the special variable $?

for ex:
$ ls -la
$ echo $?
will output 0 for success

$ foobar
$ echo $?
will output 127 for error, because foobar does not exist

Here’s a list of special parameters from the Bash Hackers Wiki

the ones that interest us for now are the positional parameters
(see Handling positional parameters)
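as a quick taste before next time, positional parameters work the same way inside functions; a sketch with a hypothetical describe function:

```shell
# $1, $2... are the positional parameters, $# counts them
describe() {
    echo "got $# arguments, first is '$1', second is '$2'"
}

# call the function with two arguments and capture its output
summary="$(describe build docs)"
echo "$summary"
```

in a script (rather than a function), $1 would be the first argument given on the command line, which is exactly how aa-project-create receives its project name.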

OK, that’s it for today, I’ll expand further in the next post