Dr. StrangePipes or: How I learned to stop worrying && "function" in Shell
What scares the Shell out of people?
Artefacts of the long and idiosyncratic history of Unix.
We have so many powerful, stable, widely-available Unix tools and standards, but also plenty of inconsistencies, variants, and arcane caveats; enough to help us... surprise... ourselves in creative new ways. See that cliff you're currently falling off? Yeah, someone went down that road when you were still eating chalk in kindergarten. (Or perhaps you fell off it once before, when I was eating chalk in kindergarten, but it was so long ago that you forgot you've been there done that :-))
Worse, there's no adult supervision, only user manuals. Within Shell's neon-lit environs, one must learn to tread with care, creativity, and cool calculation.
One must learn to accept the silent echo as reward for success.
Why go to Shell at all?
Several of us already get by just fine with a few memorised spells from Shell. That's not a bad way to live, but we miss out on tremendous everyday value that we can unlock with the Unix way.
For there exists a vast no-man's land between memorised tricks and large-scale "designed" solutions, wherein no single software tool can really solve all the problems we encounter.
Yet, any modern *nix PC has all one needs to adequately and speedily tailor-make solutions for almost any problem in that underserved no-man's land.
Solutions that we can use effectively until we truly hit the kind of scale problems or domain complexity that merit use of specialised tools.
Unix nature is deeply Functional
The Unix tools philosophy naturally leads to functional architectures that can scale almost effortlessly. And functional programmers could do worse than learn to exploit such power.
I'll just leave you with Douglas McIlroy's answer to "Read a file of text, determine the n most frequently used words, and print out a sorted list of those words along with their frequencies.", as seen in More Shell, Less Egg:
tr -cs A-Za-z '\n' |
tr A-Z a-z |
sort |
uniq -c |
sort -rn |
sed ${1}q
A purely functional, automatically buffered data processing pipeline, built with standard reusable parts, written in 1986, directly usable on today's computers.
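To see for yourself, the pipeline can be wrapped verbatim in a shell function (the name `wordfreq` and the sample input are my own illustration) and used interactively or in scripts:

```shell
#!/usr/bin/env bash

# wordfreq: McIlroy's pipeline, verbatim, wrapped in a function.
# $1 is how many of the most frequent words to print.
wordfreq() {
  tr -cs A-Za-z '\n' |  # break the input into one word per line
  tr A-Z a-z |          # normalise case
  sort |                # bring identical words together...
  uniq -c |             # ...so uniq -c can count each run
  sort -rn |            # most frequent first
  sed ${1}q             # keep only the top $1 lines
}

# e.g.  wordfreq 10 < book.txt
```

Because it reads stdin and writes stdout, it composes with everything else, exactly like the tools it is built from.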
Imaginative Functional Programmers (aren't we all?!) will find striking parallels between the facilities, concepts, and design principles available in their $Langs, and their *nix environments.
No surprise, because in the dreamy mists of time, the wise hackers of Unix lore, and Smalltalk lore, and APL lore, and Lisp lore, and Forth lore etc... were all different, but they were all together, too.
Outline/Structure of the Workshop
We will not aim to write highly "portable"/multi-platform shell programs. That's really hard to get right, and requires a time investment that will simply not pay off in the generally homogeneous environments that we inhabit.
We will focus on the sweet spot occupied by everyday problems we encounter in the bounded contexts of our everyday work life. It is here that we can powerfully leverage the Unix way.
We will see how to apply FP techniques to build our own "Swiss Army toolkits" of custom functions and utilities...
- that work seamlessly with the standard Unix model, and so
- are useful not just within scripts, but also interactively at the command line
- can interoperate with any other tool that understands the Unix way
- allowing surprisingly little code to do surprisingly powerful things
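For a small taste (the function name here is my own illustration, not a standard tool), this is the kind of building block I mean. It reads stdin and writes stdout, so it slots into any pipeline and works just as well at the interactive prompt:

```shell
#!/usr/bin/env bash

# trim: strip leading/trailing whitespace from every line of stdin.
# Because it is a plain stdin-to-stdout filter, it composes with any
# other tool that speaks the Unix way, in scripts or interactively.
trim() {
  sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//'
}

# Used like any standard tool:
printf '  hello world  \n' | trim   # prints: hello world
```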
We will find ways to exploit built-in host semantics and guarantees, to avoid manual state management, locking, and/or mutable book-keeping (e.g. streaming-data-oriented semantics/APIs, pipes, subshells, process substitution, good use of standard I/O/Error etc.).
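A minimal sketch of that idea, using process substitution so that the shell, not us, manages the intermediate plumbing:

```shell
#!/usr/bin/env bash

# Process substitution: <(cmd) hands a command a file-like view of a
# live stream. We can compare two computations with no temp files, no
# cleanup step, and no mutable bookkeeping; the shell wires up the
# pipes and tears them down for us.
if diff -q <(seq 1 5) <(printf '%s\n' 1 2 3 4 5) >/dev/null; then
  echo "streams match"
fi
```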
We will use a standard Shell, with standard built-in *nix tools, and perhaps a few widely ported installable packages. Specifically, the system I will use to demo/teach includes:
- Ubuntu 18+ (and all the pre-packaged standard tools)
- Bash 4.4+
- GNU AWK (i.e. gawk, which may not be the default awk on your Linux)
- jq (for JSON processing)
- emacs, vim, tmux, and git
Things I show should port over fairly easily, if not directly, to other *nix/shell combinations.
Sooner or later, you will find yourself left with no choice but to go to Shell (or a Shell user), searching for truth in The Machine.
So why not already learn "fn" ways to...
- enhance one's own personal workflow (type less, think and design more)
- help one's small team/group translate repetitive, manual, error-prone tasks into shared workflows, standards, and conventions
- tame oncall log analysis problems (and post-incident analysis problems too),
- solve little-data and middle-data problems (crunch a few Mega/Gigabytes to a few Terabytes of data),
- stitch together build/deploy/analytics/whatever pipelines and keep 'em running
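For instance, a typical oncall log question ("which status codes dominate?") yields to the same select-sort-count-rank shape as McIlroy's pipeline. The field position below is an assumption about a made-up log format, not any standard:

```shell
#!/usr/bin/env bash

# status_counts: count HTTP status codes seen on stdin, ranked by
# frequency. Assumes a hypothetical log format where the status code
# is the third whitespace-separated field.
status_counts() {
  awk '{ print $3 }' |  # pick out the status-code field
  sort |                # bring identical codes together...
  uniq -c |             # ...so uniq -c can count each run
  sort -rn              # most frequent first
}

# e.g.  status_counts < access.log
printf '%s\n' 'GET /a 200' 'GET /b 500' 'GET /a 200' | status_counts
```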
And who knows? Perhaps you will far surpass this workshop, and manage to quietly replace yourself with tiny shell scripts that keep you earning your pay while you colonise everything in Kerbal Space.
Programmers seeking to grok Unix nature
Prerequisites for Attendees
You use a Shell on Linux / Unix systems at least at a beginner's level, which means you have:
- a comfortable grasp of how to:
- navigate and manipulate the filesystem (cd, ls, mv, rm, find)
- view/search text (cat, less, more)
- observe and manipulate processes (top, ps, kill)
- some idea of how to:
- use pipes and stdin/stdout/stderr
- read and navigate manpages
- use shell globbing, expansion, substitution rules
- "StackOverflowed" some tricks to:
- process/analyze textual data (grep, and maybe also sed, tr, sort, uniq)
- automate things with shell script
- customise your shell prompt and/or bash profile
Windows shell users are welcome to attend, of course, but I apologise in advance: I have no experience with Windows shell environments, so I cannot translate my understanding for you. However, I suspect enough of the design ideas will port over to make it worth your while to participate.