Shell scripting is excellent for many system operations. Many of the utilities are, er…, like 40-50 years old, and were designed for unix with exactly such purposes in mind. Utilities like ls, find, test, grep, uniq, comm, cut, join, rev, sort, head, tail, awk, sed, and dc form a fairly good set of tools for e.g. working with various csv-style files (see the short pipeline sketched after this paragraph). Prior to perl, most system operations were done by combinations of such things, in various shells like sh or csh. When clock speeds crept upwards some 30 years ago, perl took over as the universal tool for such purposes.
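
For instance, given a hypothetical pair of header-less files people.csv (id,name) and scores.csv (id,score), a few of these utilities suffice to join them and rank the result. A minimal sketch, with the file names and column layout assumed purely for illustration:

    # join(1) needs its inputs sorted on the join field (column 1 here)
    sort -t, -k1,1 people.csv > people.sorted
    sort -t, -k1,1 scores.csv > scores.sorted

    # join on id, keep name and score, sort numerically by score
    # in descending order, and show the top three rows
    join -t, people.sorted scores.sorted |
        cut -d, -f2,3 |
        sort -t, -k2,2nr |
        head -n 3

Nothing here is clever; each tool does one small job and the pipe glues them together, which is the whole point.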
With even faster computers, python was developed. The extra cycles, as 8 MHz became 4 GHz, were mostly spent on interpreted languages like python, java, and R. This made various things easier, at the cost of performance. Performance isn’t that important for system operations.
Those who required speed for technical and/or scientific computing used compiled languages, mostly C, C++, and Fortran, and gradually got several orders of magnitude of speedup. Or they used the, er…, Cobol magic available in python: special arrays like numpy for computation, together with a set of numpy tools written in C. Or they used R with embedded C++ code (it’s really easy in R). Some even used assembly for critical parts of the code. That has gotten harder and harder with all the parallelism in modern cpus, so most such development is still done in C, C++, or Fortran, though with the newcomer julia as an option.
There are many ways to do things. Why python has become so popular I do not fully understand. Perhaps because universities found it suitable as an easy introduction to programming. My university (the University of Oslo) switched from simula to java to python as the introductory language over the last 40 years.