Competitive Advantage

During the Big Bang, after 10⁻⁵ seconds (ten millionths of a second), the quarks came together to form protons and neutrons. The antiquarks did the same, forming antiprotons.
But for every 10 billion antiprotons, the Universe contained 10 billion and 1 protons. The protons and antiprotons annihilated each other, destroying almost all of the antimatter and leaving that one-in-ten-billion surplus of matter to dominate the Universe.

That’s the danger of allowing a competitor to achieve even a slight advantage.

 

References:

The Age of Spiritual Machines, by Ray Kurzweil


What do Hackers wear?

Excerpts from my blog: Hackers and the Open Source Revolution

Strangulation Device

Hackers dress for comfort, function, and minimal maintenance rather than for appearances (some, perhaps unfortunately, take this to extremes and neglect personal hygiene). They have a very low tolerance for suits and other “business” attire; in fact, it is not uncommon for hackers to quit a job rather than conform to a dress code.

When they are somehow pushed into conforming to a dress code, they will find ways to subvert it, for example, by wearing absurd novelty ties. Most hackers I know consider a tie as a strangulation device that partially cuts off the blood supply to the brain… which explains the behaviour of tie-wearers. A tie could bestow upon you the reputation of a super-loser, a suit-wearing super-user with no clue — someone with root privileges on a UNIX system but no idea what he is doing; the equivalent of a three-year-old with a 1919 machine gun for a toy.

3-year old with Machine Gun

In times of dire stress, he may roll up his sleeves and loosen the tie about half an inch. It seldom helps.

Female hackers almost never wear visible makeup and many use none at all.

REFERENCES

  1. How to become a Hacker – an essay by Eric Steven Raymond

More reasons to learn Lisp

I think that it’s extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don’t think we are. I think we’re responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don’t become missionaries. Don’t feel as if you’re Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don’t feel as if the key to successful computing is only in your hands. What’s in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.
– Alan J. Perlis (The first recipient of the Turing Award)

Those who read my previous article on Lisp will find a familiar resonance in Alan J. Perlis’s words above. Lisp, after all, is all about having fun and stretching the capabilities of the computer and of the programming language itself. One of the ways it does this is by being designed to be extensible. Read on for more.

Getting acquainted

There was a joke back in the ’80s, when Reagan’s SDI (Strategic Defense Initiative) program was in full swing, that someone stole the Lisp source code to the missile interceptor program and, to prove it, showed the last page of the code…

)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))

No, LISP does not stand for Lots of Irritating Superfluous Parentheses.

Lisp, whose name is an acronym for LISt Processing, was designed to provide symbol-manipulating capabilities for attacking programming problems such as the symbolic differentiation and integration of algebraic expressions. Despite its inception as a mathematical formalism, Lisp is a practical programming language. The basic elements of Lisp include its primary data structure, the s-expression, and the Lisp interpreter, which is the heart of any Lisp system: essentially a machine that carries out processes described in the Lisp language. The Lisp interpreter performs computations on s-expressions through a process called evaluation. Since its earliest days, Lisp implementations have been variously interpreted or compiled, and often both. No modern commercial Lisp is without a compiler. The fact that modern Lisps often come with an interpreter as well is simply a convenience in some implementations to encourage late-binding semantics and promote flexibility, including interactive debugging.
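
To make this concrete, here is a minimal sketch (the name *expr* is purely illustrative) of how the same s-expression serves as both data and program:

;; An s-expression stored as ordinary list data.
(defparameter *expr* '(+ 1 (* 2 3)))

(first *expr*)   ; => +   (we can inspect it like any other list)
(eval *expr*)    ; => 7   (the evaluator computes its value)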

Genesis

Guy L. Steele Jr. and Richard P. Gabriel in their paper ‘The Evolution of Lisp’¹ say that the origin of Lisp was guided more by institutional rivalry, one-upsmanship, and the glee born of technical cleverness that is characteristic of the “hacker culture” than by sober assessments of technical requirements.

How did it all start? Early thoughts about a language that eventually became Lisp started in 1956 when John McCarthy attended the Dartmouth Summer Research Project on Artificial Intelligence. Actual implementation began in the fall of 1958. These are excerpts from the essay ‘Revenge of the Nerds’² by Paul Graham:
Lisp was not really designed to be a programming language, at least not in the sense we mean today. What we mean by a programming language is something we use to tell a computer what to do. McCarthy did eventually intend to develop a programming language in this sense, but the Lisp that we actually ended up with was based on something separate that he did as a theoretical exercise – an effort to define a more convenient alternative to the Turing Machine. As McCarthy said later, “Another way to show that Lisp was neater than Turing machines was to write a universal Lisp function and show that it is briefer and more comprehensible than the description of a universal Turing machine. This was the Lisp function eval…, which computes the value of a Lisp expression…. Writing eval required inventing a notation representing Lisp functions as Lisp data, and such a notation was devised for the purposes of the paper with no thought that it would be used to express Lisp programs in practice.”

What happened next was that, some time in late 1958, Steve Russell, one of McCarthy’s grad students, looked at this definition of eval and realized that if he translated it into machine language, the result would be a Lisp interpreter.

This was a big surprise at the time. Here is what McCarthy said about it later in an interview:
“Steve Russell said, look, why don’t I program this eval…, and I said to him, ho, ho, you’re confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into [IBM] 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today….”

Suddenly, in a matter of weeks I think, McCarthy found his theoretical exercise transformed into an actual programming language – and a more powerful one than he had intended.

Bottom-up programming

Lisp is designed to be extensible: it lets you define new operators yourself. This is possible because the Lisp language is made out of the same functions and macros as your own programs. So it’s no more difficult to extend Lisp than to write a program in it. In fact, it’s so easy (and so useful) that extending the language is standard practice. As you’re writing your program down toward the language, you build the language up toward your program. You work bottom-up, as well as top-down.
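
As a small illustrative sketch (while is not part of standard Common Lisp; we add it ourselves), here is a brand-new control operator that, once defined, reads just like a built-in one:

;; Define a new operator; the language now has WHILE.
(defmacro while (test &body body)
  `(do ()
       ((not ,test))
     ,@body))

;; Using the new operator exactly like a built-in construct: prints 3, 2, 1.
(let ((n 3))
  (while (> n 0)
    (print n)
    (decf n)))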

Almost any program can benefit from having the language tailored to suit its needs, but the more complex the program, the more valuable bottom-up programming becomes. A bottom-up program can be written as a series of layers, each one acting as a sort of programming language for the one above. TeX was one of the earliest programs to be written this way. You can write programs bottom-up in any language, but Lisp is by far the most natural vehicle for this style.

Bottom-up programming leads naturally to extensible software. If you take the principle of bottom-up programming all the way to the topmost layer of your program, then that layer becomes a programming language for the user. Because the idea of extensibility is so deeply rooted in Lisp, it is the ideal language for writing extensible software.

Working bottom-up is also the best way to get reusable software. The essence of writing reusable software is to separate the general from the specific, and bottom-up programming inherently creates such a separation.

Instead of devoting all your effort to writing a single, monolithic application, you devote part of your effort to building a language, and part to writing a (proportionately smaller) application on top of it. What’s specific to this application will be concentrated in the topmost layer. The layers beneath will form a language for writing applications like this one – and what could be more reusable than a programming language?

Rapid prototyping

Lisp allows you not just to write more sophisticated programs, but to write them faster. Lisp programs tend to be short – the language gives you bigger concepts, so you don’t have to use as many. As Frederick Brooks (best known for his book ‘The Mythical Man-Month’) has pointed out, the time it takes to write a program depends mostly on its length. So this fact alone means that Lisp programs take less time to write. The effect is amplified by Lisp’s dynamic character: in Lisp the edit-compile-test cycle is so short that programming happens in real time.

Bigger abstractions and an iterative environment can change the way organizations develop software. The phrase rapid prototyping describes a kind of programming that began with Lisp: in Lisp, you can often write a prototype in less time than it would take to write the spec for one. What’s more, such a prototype can be so abstract that it makes a better spec than one written in English. And Lisp allows you to make a smooth transition from prototype to production software. When Common Lisp programs are written with an eye to speed and compiled by modern compilers, they run as fast as programs written in any other high-level language.

All of this obviously means that you can now spend less time working and finally take your family out for that dinner you’ve been promising for the last three years – happy boss, happy family.

Macros

Macros may be the single most important reason Lispers put up with all those annoying parentheses in their code; those very parentheses are what enable Lisp’s powerful macro system. Paul Graham, who is about as close to a Lisp missionary as they come, points out that Lisp code is made out of Lisp data objects. And not in the trivial sense that the source files contain characters, and strings are one of the data types supported by the language. Lisp code, after it’s read by the parser, is made of data structures that you can traverse.
If you understand how compilers work, what’s really going on is not so much that Lisp has a strange syntax (parentheses everywhere!) as that Lisp has no syntax. You write programs in the parse trees that get generated within the compiler when other languages are parsed. But these parse trees are fully accessible to your programs. You can write programs that manipulate them. In Lisp, these programs are called macros. They are programs that write programs.
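
A minimal sketch of what that means in practice (my-unless is a made-up name, chosen so as not to shadow the standard unless):

;; A macro is ordinary Lisp code that builds a new piece of Lisp code.
(defmacro my-unless (condition &body body)
  `(if (not ,condition)
       (progn ,@body)))

;; MACROEXPAND-1 shows the code the macro wrote for us.
(macroexpand-1 '(my-unless done (print "still working")))
;; => (IF (NOT DONE) (PROGN (PRINT "still working")))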

Doug Hoyte, author of the book ‘Let Over Lambda’³, gives a lot of credit to macros for efficient Lisp performance. He says that while other languages give you small, square tiles, Lisp lets you pick tiles of any size and of any shape. With C, programmers use a language that is directly tied to the capabilities of a fancy fixnum adder. Aside from procedures and structures, little abstraction is possible in C. By contrast, Lisp was not designed around the capabilities and limitations of the machine. But if Lisp is not tied to the machine, surely Lisp code can end up less efficient than C code, even if it is more convenient to write?

Instead of inquiring what makes a program fast, it’s better to ask what makes a program slow. The root causes can be roughly classified into three broad categories:

  1. Bad algorithms
  2. Bad data-structures, and
  3. General code

All language implementations need good algorithms. An algorithm is a presumably well-researched procedural description of how to execute a programming task. Because the investment required in coming up with an algorithm is so much larger than that of implementing one, the use of algorithms is ubiquitous throughout all of computer science. Somebody has already figured out how, and why, and how quickly, an algorithm works; all you have to do to use an algorithm is translate its pseudo-code into something that your system can understand. Because Common Lisp implementations have typically been well implemented and continuously improved upon for decades, they generally employ some of the best and quickest algorithms around for most common tasks.

Good data structures are also necessary for any decent programming language. Data structures are so important that ignoring them will cause any language implementation to slow to a crawl. Optimizing data structures essentially comes down to a concept called locality – which basically says that the data accessed most frequently should be the fastest to access. Data structures and locality can be observed at almost every level of computing where performance gains have been sought: large sets of CPU registers, memory caches, databases, and caching network proxies, to name a few. Lisp offers a huge set of standard data structures, and they are generally implemented very well.
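
For instance, a hash table – one of those standard structures – gives fast keyed lookup with almost no ceremony (the *capitals* table below is just an example):

;; A built-in hash table: near-constant-time lookup for frequently used data.
(defparameter *capitals* (make-hash-table :test #'equal))
(setf (gethash "France" *capitals*) "Paris")
(setf (gethash "Japan" *capitals*) "Tokyo")

(gethash "Japan" *capitals*)   ; => "Tokyo", T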

If Lisp provides such good algorithms and data structures, how is it even possible that Lisp code can be slower than code in other languages? The explanation lies in the most important design decision of Lisp: general code, a concept otherwise familiar to us as duality of syntax. When we write Lisp code, we use as many dualities as possible; the very structure of the language encourages us to. Part of the reason why Lisp programs are usually much shorter than programs in other languages is that any given piece of Lisp code can be used for so much more than a corresponding piece of code in another language, so you can re-use it more often. Coming from another programming language, it can feel unusual to write less and get more, but this is an important Lisp design decision – duality of syntax. The more dualities attached to each expression, the shorter the program becomes.

Does this mean that to achieve or exceed C’s performance we need to make our Lisp programs as long and dangerous as their corresponding C programs? No, Lisp has macros.
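
A rough sketch of how that works: a macro can expand into the verbose, declaration-heavy code the compiler loves, while the source you actually write stays short (with-fixnum-math is a name invented for this example):

;; The macro writes the optimization boilerplate so we don't have to.
(defmacro with-fixnum-math (&body body)
  `(locally (declare (optimize (speed 3) (safety 0)))
     ,@body))

(defun dot3 (a1 a2 a3 b1 b2 b3)
  (declare (type fixnum a1 a2 a3 b1 b2 b3))
  (with-fixnum-math
    (+ (* a1 b1) (* a2 b2) (* a3 b3))))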

A great medium to express Recursion

Recursion is the act of defining an object or solving a problem in terms of itself. Properly used, recursion is a powerful problem solving technique, both in artificial domains like mathematics and computer programming, and in real life.

The power of recursion evidently lies in the possibility of defining an infinite set of objects by a finite statement. In the same manner, an infinite number of computations can be described by a finite recursive program, even if this program contains no explicit repetitions.⁴

Lisp is the best programming language to use when working with recursive problems. Daniel P. Friedman and Matthias Felleisen demonstrate this case for Lisp in their book ‘The Little Lisper’⁵. Lisp is inherently symbolic – the programmer does not have to make an explicit mapping between the symbols of his own language and the representations in the computer. Recursion is Lisp’s natural computational mechanism; the primary programming activity is the creation of (potentially) recursive definitions.
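
In that spirit, here is a tiny recursive definition – the length of a list defined in terms of the length of its rest (my-length is named so as not to shadow the built-in length):

;; An empty list has length 0; otherwise the length is one more than
;; the length of the rest of the list.
(defun my-length (lst)
  (if (null lst)
      0
      (+ 1 (my-length (rest lst)))))

(my-length '(a b c d))   ; => 4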

Functional vs. Object Oriented

In rare moments of self-reflection, when I allow myself to doubt my skills as a Lisp evangelist, I sometimes wonder if I have left behind some of my fellow programmers who favor the object-oriented style of programming. Just because I have been focusing on Lisp as a functional programming language doesn’t mean we don’t have a role for you in our plans for world domination. Here’s where you fit in.

With an OO approach, a programmer writes code that describes in exacting detail the steps that the computer must take to accomplish the goal. She focuses on how to perform tasks and how to track changes in state. She would use loops, conditions, and method calls as her primary flow control, and instances of structures or classes as her primary manipulation units. OO tries to control state behind object interfaces.

In contrast, FP involves composing the problem as a set of functions to be executed. An FP programmer focuses on what information is desired and what transformations are required, carefully defining the input to each function and what each function returns. She would use function calls, including recursion, as her primary flow control, and functions as first-class objects and data collections as her primary manipulation units. FP tries to minimize state by using pure functions as much as possible.

OO makes code understandable by encapsulating moving parts.
FP makes code understandable by minimizing moving parts.⁶

Functional Programming is the art of writing programs that work by returning values, instead of modifying things. Functional Programming also enables you to create fabulously powerful and very efficient abstract programs. Functional Programming is a mathematical approach to programming. In math, for at least the last four hundred years, functions have played a much more central role. Functions express the connection between parameters (the “input”) and the result (the “output”) of certain processes. In each computation the result depends in a certain way on the parameters. Therefore a function is a good way of specifying a computation. This is the basis of the functional programming style. A ‘program’ consists of the definition of one or more functions. With the ‘execution’ of a program the function is provided with parameters, and the result must be calculated. Writing code in a functional style guarantees that a function does only one thing (returns a value) and is dependent on one thing (the parameters passed to it). This equips you to control side effects.
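
As a minimal sketch of that style (the names are purely illustrative), here is a function that neither reads nor modifies anything outside itself:

;; A pure function: its result depends only on its parameters, and it
;; returns a new value instead of modifying anything.
(defun add-bonus (salary rating)
  (if (>= rating 9)
      (* salary 11/10)
      salary))

(add-bonus 1000 9)   ; => 1100, every time, because there is no hidden state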

Conrad Barski, author of ‘Land of Lisp’⁷, points out that critics of the OO programming style may complain that object-oriented techniques force data to be hidden away in a lot of disparate places by requiring it to live inside many different objects. Having data located in disparate places can make programs difficult to understand, especially if that data changes over time. Therefore, many Lispers prefer functional techniques over object-oriented techniques, though the two can often be used together with some care. Nonetheless, there are still many domains in which object-oriented techniques are invaluable, such as user interface programming or simulation programming.

However, some side effects are almost always necessary for a program to actually do something. This means that you can’t write a useful program that has the entirety of its code written in the functional style. James Hague in his assessment of functional programming argues that “100% pure functional programming doesn’t work. Even 98% pure functional programming doesn’t work. But if the slider between functional purity and 1980s BASIC-style imperative messiness is kicked down a few notches – say to 85% – then it really does work. You get all the advantages of functional programming, but without the extreme mental effort and unmaintainability that increases as you get closer and closer to perfectly pure.”

So, if OO is what gets you going, Common Lisp offers the most sophisticated object-oriented programming framework of any major programming language. It’s called Common Lisp Object System (CLOS). It is customizable at a fundamental level using the Metaobject Protocol (MOP). It’s claimed that there’s really nothing like it anywhere else in programming. It lets you control incredibly complex software without losing control over the code.
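
A very small taste of CLOS, using a toy account class invented for this sketch:

;; A class with one slot, and a method specialized on that class.
(defclass account ()
  ((balance :initarg :balance :accessor balance :initform 0)))

(defgeneric deposit (acct amount))

(defmethod deposit ((acct account) amount)
  (incf (balance acct) amount))

(deposit (make-instance 'account :balance 100) 50)   ; => 150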

Two common Lisp myths shattered

Myth 1: Lisp is slow because it is interpreted
Common Lisp is not an interpreted language. In fact, there is no reasonable definition of “interpreted language”. The only two candidate definitions I can think of are:

  1. A language that can be implemented with an interpreter, and
  2. A language that must be implemented with an interpreter.

In the first case, all languages are interpreted. In the second case, no language is interpreted.

Sometimes, we confuse interpreted and interactive. We tend to think that whenever there is an interactive loop such as Lisp’s read-eval-print loop, there must also be an interpreter. That is false: the eval part can very well be implemented with a compiler. Sometimes the belief is that even though it is possible to implement Common Lisp with a compiler, it is usually done with an interpreter, and hence most implementations are slow. This might be true of toy implementations written by amateurs, but it is seldom the case with systems written by hackers. Almost every Common Lisp system in existence uses a compiler, and most Common Lisp compilers are capable of generating very good, very fast code. Obtaining fast code may require the hacker to add declarations that assist the compiler in optimizing the code, since fast object code is not derived automatically from naïve source code.
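
A small sketch of what such declarations look like, and of one way to convince yourself there is no interpreter in the loop (sum-of-squares is just an example function):

;; Type and optimization declarations let the compiler emit tight native code.
(defun sum-of-squares (a b)
  (declare (type fixnum a b)
           (optimize (speed 3) (safety 0)))
  (the fixnum (+ (* a a) (* b b))))

;; In most implementations, DISASSEMBLE prints the compiled code,
;; showing that the function is not being interpreted.
(disassemble #'sum-of-squares)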

Also, mortals have a tendency to exaggerate the importance of speed. There are quite a number of applications that use very slow systems, such as Perl, Shell, Tcl/Tk, Awk, etcetera. It is not true that maximum speed is always required.

Myth 2: Lisp is not used in industry (I’ve not seen Lisp in job advertisements)
Lisp is used. Several major commercial software packages are written in Common Lisp or some other Lisp variant. It is hard to know in what language commercial software is written (since the user should not have to care), but there are a few well-known examples. Interleaf, a documentation system, is written in Lisp. AutoCAD, a system for computer-aided design, embeds a Lisp dialect (AutoLISP) as its extension language. Both are major applications in their domains. While not a commercial software system, Emacs is an important system written largely in Lisp.
But even if Common Lisp were not used at all in industry, this would not be a good argument. The level of sophistication of the software industry is quite low with respect to programming languages, tools, and methods. The university should teach advanced languages, tools and methods with the hope of having industry use them one day, as opposed to teaching bad ones that happen to be used today. Students who want training in particular tools that happen to be demanded at the moment, should quit the university and apply for more specific training programs.

Lisp was and is one of the dominant languages for Artificial Intelligence (AI) programming, but should not be seen as a language that it is exclusive to the AI world. The AI community embraced the language in the 1980s because it enabled rapid development of software in a way that was not possible with other mainstream languages of the time, such as C. In the 1980s and early 1990s, the emphasis of mainstream software development methodologies was on ‘getting it right first time’, an approach that demanded up-front effort on careful system specification and did not allow for changes in specification during later stages of the software development lifecycle. In AI, software development needed to be much more agile, so that inevitable changes in specification could more easily be accommodated by iterative development cycles. Lisp is ideal for this, as it can be tested interactively and provides for concise, quick coding using powerful high-level constructs such as list processing and generic functions. In other words, Lisp was ahead of its time because it enabled agile software development before it became respectable in mainstream software.

Besides, whether a language such as Common Lisp is used in industry depends a lot more on the individual student than on industry. There is a widespread myth among students that industry is this monolithic entity whose tools and methods cannot be altered. In reality, industry consists of people. Whatever industry uses is whatever the people working there use. Instead of refusing to learn sophisticated tools and techniques, the student can resolve to try to become one of the forerunners in industry after graduation.

Lisp: God’s own programming language

I would like to think that this article and its precursor strike a chord with all of you – the ones who’ve fallen in love with Lisp, the ones who still don’t get what all the fuss is about, and the ones who need some more nudging to fall off the fence (onto the right, Lisp-loving side). For those of you in the second and third categories, here’s your final push:

For God wrote in Lisp code
When He filled the leaves with green.
The fractal flowers and recursive roots:
The most lovely hack I’ve seen.
And when I ponder snowflakes,
never finding two the same,
I know God likes a language
with its own four-letter name.

– Partial lyrics of the song “Eternal Flame” (http://www.gnu.org/fun/jokes/eternal-flame.html)

REFERENCES

  1. ‘The Evolution of Lisp’, Guy L. Steele Jr. and Richard P. Gabriel
  2. ‘Revenge of the Nerds’, Paul Graham
  3. ‘Let Over Lambda’, Doug Hoyte
  4. ‘Algorithms + Data Structures = Programs’, Niklaus Wirth, Prentice-Hall, 1976
  5. ‘The Little Lisper’, Daniel P. Friedman and Matthias Felleisen
  6. Michael Feathers (http://twitter.com/#!/mfeathers/status/29581296216)
  7. ‘Land of Lisp’, Conrad Barski

Reasons to learn ANSI Common Lisp

Lisp has been hailed as the world’s most powerful programming language. But only a few programmers use Lisp, because of its cryptic syntax and academic reputation, which is rather unfortunate since Lisp isn’t that hard to grasp. Only the top percentile of programmers use Lisp. If you want to be among the crème de la crème, read on…

This sucks!

“I am gonna hate my job!” Those were my initial thoughts when I received an assignment at work a few years ago. I had been asked to leverage a module written in Lisp. My perception of Lisp was that of an ancient functional programming language with a cryptic syntax used only by academicians & scientists to conduct experiments in the domain of Artificial Intelligence. And those parentheses in the syntax were enough to drive anyone crazy! LISP – Lots of Irritating Superfluous Parentheses?

At that time, I believed that I was an ace at a cool, new-age object-oriented programming language. This programming language was the medium of my expression: I ate, drank, and dreamt in that language. Because I could produce high magic with it, I was revered as the exalted canonical wizard among the confrere developer community. It made me the God of my machine universe. Through its syntax I would say when it’s sunny and when it rains, and my machine universe would comply. I ruled over the entities of my creation with an iron fist, without the consequence of defiance or the fear of yet another Jasmine Revolution.

I also believed I was someone exceptionally attractive to women, spent hours in front of the mirror doing my hair, and drove my sluggish and worn-out Kinetic Honda as if it were a 1340cc Suzuki Hayabusa.

I now know better…

Déjà vu

Those of us who witnessed the shift from the non-structured programming paradigm to procedural programming, and then towards object-oriented programming, will relate to Paul Graham when he says in his book ‘Hackers & Painters’ something to the tune of: “You can’t trust the opinion of others about which programming language will make you a better programmer. You’re satisfied with whatever programming language you happen to use, because it dictates the way you think about programs. I know this from my own experience, as a high school kid writing programs in BASIC. That language didn’t even support recursion. It’s hard to imagine writing programs without using recursion, but I didn’t miss it at the time. I thought in BASIC. And I was a whiz at it. Master of all I surveyed.”

Three weeks into hacking Lisp, I had a feeling of déjà vu – the previous experience being when I first ‘progressed’ from BASIC to C, and from C to C++ and Java. With each leap, I would be happily surprised with the growing (programming) power at my fingertips. Time and again I would wonder how I had ever coded without Objects, Methods, Encapsulation, Polymorphism, Inheritance, etcetera. One may say that it was ‘syntactic sugar’ at work.

But not with Lisp. Lisp is pure ecstasy. It’s not just beautiful, but strangely beautiful.

In his famous essay ‘How to become a Hacker’, Eric Steven Raymond (author of many best sellers including ‘The Cathedral and the Bazaar’) writes “LISP is worth learning for the profound enlightenment experience you will have when you finally get it. That experience will make you a better programmer for the rest of your days, even if you never actually use LISP itself a lot.”

Lisp enlightens you as a hacker

What’s so great about Lisp? How does it enlighten you as a hacker? Lisper Paul Graham explains this so proficiently and methodically that it would be inappropriate to answer these questions in any words other than his. The five languages (Python, Java, C/C++, Perl, and Lisp) that Eric Raymond recommends to hackers fall at various points on the power continuum. Where they fall relative to one another is a sensitive topic. But I think Lisp is at the top. And to support this claim I’ll tell you about one of the things I find missing when I look at the other four languages. How can you get anything done in them, I think, without macros?

Many languages have something called a macro. But Lisp macros are unique. Lisp code is made out of Lisp data objects. And not in the trivial sense that the source files contain characters, and strings are one of the data types supported by the language. Lisp code, after it’s read by the parser, is made of data structures that you can traverse.

If you understand how compilers work, what’s really going on is not so much that Lisp has a strange syntax (parentheses everywhere!) as that Lisp has no syntax. You write programs in the parse trees that get generated within the compiler when other languages are parsed. But these parse trees are fully accessible to your programs. You can write programs that manipulate them. In Lisp, these programs are called macros. They are programs that write programs. (If you ever were to enter The Matrix, you’d be happy that you are a Lisp maestro.)

We know that Java must be pretty good, because it is the cool programming language. Or is it? Within the hacker subculture, there is another language called Perl that is considered a lot cooler than Java. But there is another, Python, whose users tend to look down on Perl, and another called Ruby that some see as the heir apparent of Python. If you look at these languages in order, Java, Perl, Python, Ruby, you notice an interesting pattern. At least, you notice this pattern if you are a Lisp hacker. Each one is progressively more like Lisp. Python copies even features that many Lisp hackers consider to be mistakes. And if you’d shown people Ruby in 1975 and described it as a dialect of Lisp with syntax, no one would have argued with you.

Programming languages have almost caught up with 1958! Lisp was first discovered by John McCarthy in 1958, and popular programming languages are only now catching up with the ideas he developed then.

Lisp enlightens you as an individual

All the married men would relate to Steven Levy when he illustrates how hackers think in his book ‘Hackers: Heroes of the Computer Revolution’. Marge Saunders would return from the supermarket one weekend morning, drive into the garage, and ask her husband, Bob, “Would you like to help me bring in the groceries?” He would reply, “No”. Stunned, she would drag in the groceries herself. After the same thing occurred a few times, she exploded, hurling curses at him and demanding to know why he said no to her question.

“That’s a stupid question to ask”, he said. “Of course I won’t like to help you bring in the groceries. If you ask me if I’ll help you bring them in, that’s another matter.”

When I used to program in my favorite OO programming language my response was no different. Luckily for me I discovered Lisp. It gave me a holistic view of the self, the cosmos, and that there are better responses to a question than a simple Yes/No. From then on, I learnt that the right answer to a question like that would be “Sure, Dear! Do you need me to do anything else for you?”. Needless to say my wife is a happier person and we celebrated our seventh anniversary last week.

The Functional Programming Edge

In his famous paper ‘Why Functional Programming Matters’, computer scientist R. John M. Hughes says that conventional languages place conceptual limits on the way problems can be modularized. Functional languages push those limits back. Two features of functional languages in particular, higher-order functions and lazy evaluation, can contribute greatly to modularity. As an example, Lisp allows us to manipulate lists and trees, program several numerical algorithms, and implement the alpha-beta heuristic (an algorithm from Artificial Intelligence used in game-playing programs). Since modularity is the key to successful programming, functional languages are vitally important to the real world.
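
A minimal sketch of higher-order functions acting as that glue (the function names here are invented for illustration):

;; Functions are passed around and combined like any other value.
(defun squares (numbers)
  (mapcar (lambda (n) (* n n)) numbers))

(defun sum-list (numbers)
  (reduce #'+ numbers :initial-value 0))

(sum-list (squares '(1 2 3 4)))   ; => 30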

Getting started

Any language that obeys the central principles of Lisp is considered a Lisp dialect. However, the vast majority of the Lisp community uses two Lisps: ANSI Common Lisp (often abbreviated CL) and Scheme. Here, I will be exclusively talking about the ANSI Common Lisp dialect, the slightly more popular of the two.

Many great Lisp compilers are available, but one in particular is easiest to get started with: CLISP, an open source Common Lisp. It is simple to install and runs on any operating system. Mac users may want to consider LispWorks, which will be easier to get running on their machines.

Installing CLISP

You can download a CLISP installer from http://clisp.cons.org/. It runs on Windows, Macs, and Linux variants. On Windows, you simply run an installer program. On a Mac, there are some additional steps, which are detailed on the website.

On a Debian-based Linux machine, you should find that CLISP already exists in your standard sources. Just type apt-get install clisp at the command line, and you’ll have CLISP installed automatically.

For other Linux distributions (Fedora, SUSE, etcetera), you can use standard packages listed under “Linux packages” on the CLISP website.

Starting it up

To run CLISP, type clisp from your command line. If all goes according to plan, you’ll see the following prompt:

Starting CLISP

Like all Common Lisp environments, CLISP will automatically place you into a read-eval-print-loop (REPL) after you start it up. This means you can immediately start typing in Lisp code. Try it out by typing (* 7 (+ 4 3)). You’ll see the result printed below the expression:

[1]> (* 7 (+ 4 3))
49

In the expression (* 7 (+ 4 3)), the * and the + are called operators, and the numbers 7, 4, and 3 are called the arguments. In everyday life, we would write this expression as ((4 + 3) * 7), but in Lisp we put the operator first, followed by the arguments, with the whole expression enclosed in a pair of parentheses. This is called prefix notation, because the operator comes first.

By the way…if you make a mistake, and CLISP starts acting crazy, just type :q and it’ll fix everything. When you want to shut down CLISP, just type (quit).

What’s under the hood?

Let’s not go down the traditional route of starting with the ABCs (learning the syntax of the language, its core features, etcetera). Sometimes, the promise of what lies beneath is more tantalizing than baring it all.

Conrad Barski (author of Land of Lisp, a great book on Lisp programming for beginners) gets you excited about Lisp by showing you how to write a game in it. Let’s adopt his method and write a simple command-line game using the binary search algorithm. Binary search continually divides the data in half, progressively narrowing down the search space until it finds a match or there are no more items to process.

It’s the classic guess-my-number game. Ask your friend (or better, your non-technical boss who yelled at you the last time you fell asleep in the meeting) to pick a number between 1 and 100 (in his head), and your Lisp program will guess it in no more than seven iterations.

This is how Barski explains the game:
To create this game, we need to write three functions: guess-my-number, smaller, and bigger. The player simply calls these functions from the REPL. To call a function in Lisp, you put parentheses around it, along with any parameters you wish to give the function. Since these particular functions don’t require any parameters, we simply surround their names in parentheses when we enter them.

Here’s the strategy behind the game:

  1. Determine the upper and lower (big and small) limits of the player’s number. In our case the smallest possible number would be 1 and the biggest would be 100.
  2. Guess a number in between these two numbers.
  3. If the player says the number is smaller, lower the big limit.
  4. If the player says the number is bigger, raise the small limit.
  5. We’ll also need a mechanism to start over with a different number.

In Common Lisp, functions are defined with defun, like this:

(defun function_name (arguments)
  ...)

As the player calls the functions that make up our game, the program will need to track the small and big limits. In order to do this, we’ll need to create two global variables called *small* and *big*. A variable that is defined globally in Lisp is called a top-level definition. We can create new top-level definitions with the defparameter function.

> (defparameter *small* 1)
*SMALL*
> (defparameter *big* 100)
*BIG*

The asterisks surrounding the names *big* and *small* – affectionately called earmuffs – are completely arbitrary and optional. Lisp sees the asterisks as part of the variable names but gives them no special meaning. Lispers like to mark all their global variables in this way as a convention, to make them easy to distinguish from local variables, which we’ll discuss in later articles.

Also, spaces and line breaks are completely ignored when Lisp reads in your code.

Now, the first function we’ll define is guess-my-number. This function uses the values of the *big* and *small* variables to generate a guess of the player’s number. The definition looks like this:

> (defun guess-my-number()
(ash (+ *small* *big*) -1))
GUESS-MY-NUMBER

Whenever we run a piece of code like this in the REPL, the resulting value of the entered expression will be printed. Every command in ANSI Common Lisp generates a return value. The defun command, for instance, simply returns the name of the newly created function. This is why we see the name of the function parroted back to us in the REPL after we call defun.

What does this function do? As discussed earlier, the computer’s best guess in this game will be a number in between the two limits. To accomplish this, we choose the average of the two limits. However, if the average number ends up being a fraction, we’ll want to use a near-average number, since we’re guessing only whole numbers.

We implement this in the guess-my-number function by first adding the numbers that represent the high and low limits, then using the arithmetic shift function, ash, to halve the sum of the limits and shorten the results. The built-in Lisp function ash looks at a number in binary form, and then shifts its binary bits to the left or right, dropping any bits lost in the process. For example, the number 11 written in binary is 1011. We can move the bits in this number to the left with ash by using 1 as the second argument.

> (ash 11 1)
22

We can move the bits to the right (and lop off the bit on the end) by passing in -1 as the second argument:

> (ash 11 -1)
5

Let’s see what happens when we call our new function:

> (guess-my-number)
50

Since this is our first guess, the output we see when calling this function tells us that everything is working as planned: The program picked the number 50, right between 1 and 100.

Now we’ll write our smaller and bigger functions. Like guess-my-number, these are global functions defined with defun:

> (defun smaller()
(setf *big* (1- (guess-my-number)))
(guess-my-number))
SMALLER

> (defun bigger()
(setf *small* (1+ (guess-my-number)))
(guess-my-number))
BIGGER

First, we use defun to start the definition of a new global function smaller. Next, we use the setf function to change the value of our global variable *big*. Since we know the number must be smaller than the last guess, the biggest it can now be is one less than that guess. The code (1- (guess-my-number)) calculates this: It first calls our guess-my-number function to get the most recent guess, and then it uses the function 1-, which subtracts 1 from the result.

Finally, we want our smaller function to show us a new guess. We do this by putting a call to guess-my-number as the final line in the function body. This time, guess-my-number will use the updated value of *big*, causing it to calculate the next guess. The final value of our function will be returned automatically, causing our new guess (generated by guess-my-number) to be returned by the smaller function.

The bigger function works in exactly the same manner, except that it raises the *small* value instead. After all, if you call the bigger function, you are saying your number is bigger than the previous guess, so the smallest it can now be (which is what the *small* variable represents) is one more than the previous guess. The function 1+ simply adds 1 to the value returned by guess-my-number.

To complete our game, we’ll add a function start-over to reset our global variables:

> (defun start-over()
(defparameter *small* 1)
(defparameter *big* 100)
(guess-my-number))

As you can see, the start-over function resets the values of *small* and *big* and then calls guess-my-number again to return a new starting guess. Whenever you want to start a brand-new game with a different number, you can call this function to reset the game.

Here’s our game in action, with the number 74 as our guess:

The game in action
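
If you follow along at the REPL with 74 as the secret number, the session should look roughly like this (the prompt will vary by implementation):

> (guess-my-number)
50
> (bigger)
75
> (smaller)
62
> (bigger)
68
> (bigger)
71
> (bigger)
73
> (bigger)
74

Seven guesses, exactly as the binary search arithmetic promised.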

Power corrupts. Lisp is power. Study it hard. Be evil. And let’s plan for world domination!


Hackers and the Open Source Revolution

This piece corrects the confusion created by mainstream media between “hacker” and “cracker”. It also considers the history, nature, attributes, ethics and attire of hackers, plus more. Interested in being one yourself, or checking why other people treat you as if you don’t fit into “normal” society? Read on…

The new generation of hackers are turning open source into a powerful force in today’s computing world. They are the heirs to an earlier hacking culture that thrived in the 1960s and 1970s when computers were still new — part of a community that believed software should be shared and that all would benefit as a result.

These expert programmers and networking wizards trace their lineage back to the first time-sharing minicomputers and the earliest ARPAnet experiments. The members of this community coined the term “hacker”. Hackers built the Internet and made the UNIX operating system what it is today. Hackers run Usenet and make the World Wide Web work.

Thanks to the advent of relatively low-cost computers and the Internet, the new hackers are immeasurably more numerous, more productive, and more united than their forebears. They are linked by a common goal — of writing great software; and by a common code — that such software should be freely available to all.

Hackers sparked the open source revolution

In 1991, Linus Torvalds sent a posting to an Internet newsgroup, asking for advice on how to make a better operating system. His project was a hobby, he said, and would never be “big and professional”. In 1994, version 1.0 of Linux was released.

Marleen Wynants and Jan Cornelis, while discussing the economic, social, and cultural impact of Free and Open Source Software in their paper “How Open is the Future?” suggest that Linux was more than just a toy for hackers. Propelled by Linux, the open source hacker culture surfaced from its underground location. Amateur hacker programmers began to create coalitions with more established parts of the software production and distribution sector. New companies and organisations were founded, while new products, licenses and communities were created.

In the spring of 1997, a group of leaders in the free software community assembled in California. This group included Eric Raymond, Tim O’Reilly, and VA Research president Larry Augustin, among others. Their concern was to find a way to promote the ideas surrounding free software among people who had formerly shunned the concept. They were concerned that the Free Software Foundation’s “anti-business message” was keeping the world at large from really appreciating the power of free software.

At Eric Raymond’s insistence, the group agreed that what they lacked to a great extent was a marketing campaign devised to win mindshare, and not just market share. Out of this discussion came a new term to describe the software they were promoting: open source. A series of guidelines were crafted to describe software that qualified as open source. While there has been a hacker subculture developing open source applications and Internet protocols for many years, without explicitly using the label “open source”, it is only in the last few years, after this conference, that this practice has become visible to a broader public.

In 1998, Microsoft’s anxiety leaked out through what is now known as the Halloween Documents. These documents comprised a series of confidential Microsoft memos on potential strategies relating to free software, open source software, and to Linux in particular. Among the leaked documents were a series of responses to the original memos.

The leaked documents and responses were published by Eric Raymond during Halloween 1998. Forced to concede that the memos did indeed originate from within the company, Microsoft dismissed them as the private speculations of a couple of engineers. “Linux has been deployed in mission-critical, commercial environments with an excellent pool of public testimonials,” Vinod Valloppillil, one of the memos’ authors, had noted.

The documents also acknowledged that open source software “…is long-term credible … FUD (spreading Fear, Uncertainty, and Doubt) tactics cannot be used to combat it,” and “Recent case studies (the Internet) provide very dramatic evidence … that commercial quality can be achieved/exceeded by OSS projects.”

FUD was a traditional Microsoft marketing strategy, acknowledged and understood internally. Examples of Microsoft’s FUD tactics included announcing the launch of non-existent products or spreading rumours that competing products would cause Windows to crash.

So, who are these hackers?

Should you happen to bump into them and inquire about their craft, hackers will gleefully inform you that programming is the most fun you can have with your clothes on… although clothes are not mandatory.

A hacker is someone who enjoys exploring the details of computers and how to stretch their capabilities, as opposed to most users, who prefer to learn the minimum necessary. Originally, “hacker” was a term of respect, used among computer programmers, designers, and engineers. The hacker was one who created original and ingenious programs.

To programmers, “hackers” connote mastery in the most literal sense: those who can make a computer do what they want it to — whether the computer wants to or not. Unfortunately, this term has been abused by the media to give it a negative connotation — of someone who breaks into systems, destroys data, steals copyrighted software and performs other destructive or illegal acts with computers and networks.

The term that accurately defines that kind of person is “cracker”.

Hackers carry stacks of ideas teetering in their heads at any given time. Their brains cannot stop collecting, consuming, or taking things apart, only to reassemble them again. But what seems to drive them is an intense ability, even a need, for analysis and organisation. When hackers encounter a technology for the first time, they do not just absorb the general shape, but go straight for the details. They feed on the logic of technology. When they do communicate, they can speak and write with great precision about what they’ve learned.

The hacker attitude

Hackers solve problems and build things, and they believe in freedom and voluntary mutual help. The hacker mindset is not confined to the realm of software (or hardware). The hacker nature is independent of the particular medium the hacker works in.

Hackerism ideas have travelled beyond the computer industry. The ideals of the hacker culture could apply to almost any activity one pursues with passion. Burrell Smith, a key member of the team that created the Apple Macintosh computer, says, “Hackers can do almost anything and be a hacker. You can be a hacker carpenter. It’s not necessarily high-tech. I think it has to do with craftsmanship, and caring about what you’re doing.”

In his book Biopunk, Marcus Wohlsen reasons that the primal urge to tinker is an essential prerequisite to being a hacker. In the hands of its most gifted practitioners, tinkering is an essential form of creativity. But it is a different brand of creativity, practised in a different spirit, than the kind suggested by the romantic image of the lone artist or genius inventor trying to wrestle inspiration out of nothing.

Tinkering in a generic sense is fiddling or tweaking, spending the weekend in the garage trying to squeeze a few more horsepower out of the Yamaha FZ 16. But it still retains the idea of “work that is not really work”. Jacking up your shocks and putting balloon tyres on your Willys Jeep is not something you do because you have to. Tinkering is work you do for fun.

Hackers embrace the playfulness of tinkering, but here’s the mischief in their creed: just because the work is fun does not mean it is unimportant. “Playing”, in the hacker sense of the word, is not just a way to stay entertained. It is an attitude toward innovation that champions gamesmanship and admires intellect applied with competitive vigour and flair.

In chess, the grandmaster and the goat each play with the same sixteen pieces. But in the hands of the former, the game becomes an object of beauty and raw intellectual force. In the same way, the gifted tinkerer can rearrange the already existing engine parts or snippets of computer code in a way that creates something utterly new and potentially transformative.

For hackers, the logical frame of mind required for programming spills over into more commonplace activities. You could ask hackers a question and sense their mental accumulators processing bits until they came up with a precise answer to the question you asked.

Marge Saunders would drive to the Safeway supermarket every Saturday morning in her Volkswagen and on her return would ask her husband, “Would you like to help me bring in the groceries?” Bob Saunders would reply, “No.” Stunned, Marge would drag in the groceries herself. After this occurred a few times, she exploded, hurling curses at him and demanding to know why he didn’t help her.

“That’s a stupid question to ask,” he said. “Of course I won’t like to help you bring in the groceries. If you ask me if I will help you bring them in, that’s another matter.” It was as if Marge had submitted a program into the TX-0, and the program, as programs do when the syntax is improper, had crashed. It was not until she debugged her question that Bob Saunders would allow it to run successfully on his own mental computer.

Hacker ethic

Wikipedia accurately explains the “hacker ethic” as a generic phrase that describes the moral values and philosophy that are standard in the hacker community. The early hacker culture and resulting philosophy originated at the Massachusetts Institute of Technology (MIT) in the 1950s and 1960s.

The term “hacker ethic” is attributed to journalist Steven Levy, as described in his book titled Hackers: Heroes of the Computer Revolution, written in 1984. The guidelines of the hacker ethic make it easy to see how computers have evolved into the personal devices we know and rely upon today.

The hacker ethic was a “…new way of life, with a philosophy, an ethic and a dream.” However, the elements of the hacker ethic were not openly debated and discussed; rather they were accepted and silently agreed upon.

Free and Open Source Software (FOSS) has evolved from the hacker ethics that Levy described. The hackers who stay true to the hacker ethics — especially the Hands-On Imperative — are usually supporters of the free and open source software movement.
The general tenets of the hacker ethic are:

  • Access to computers — and anything that might teach you something about the way the world works — should be unlimited and total. Always yield to the Hands-On Imperative! Hackers believe that essential lessons can be learned about the systems — about the world — from taking things apart, seeing how they work, and using this knowledge to create new and even more interesting things. They mostly resent any person, physical barrier, or law that tries to keep them from doing this.
  • All information should be free. If you don’t have access to the information you need to improve things, how can you fix them? A free exchange of information, particularly when the information is in the form of a computer program, allows for greater overall creativity.
  • Mistrust authority — promote decentralisation. The best way to promote this free exchange of information is to have an open system, with no boundaries between hackers and a piece of information, or an item of equipment that they need in their quest for knowledge, and their time online. The last thing they need is a bureaucracy, whether in the corporate world, in government, or at university. Bureaucracies are flawed systems, dangerous in that they cannot accommodate the exploratory impulses of true hackers. Bureaucrats hide behind arbitrary rules (as opposed to the logical algorithms by which machines and computer programs operate): they invoke those rules to consolidate their own power, and perceive the constructive impulse of hackers as a threat.
  • Hackers wish to be judged by their hacking, not bogus criteria such as degrees, age, race, or position. Hacker cultures are meritocracies, where positions are based on demonstrated knowledge and achievements. Hackers care less about people’s superficial characteristics than they do about their potential to advance the general state of hacking, to create new programs to admire, to talk about new features in the system, etc.
  • You can create art and beauty on a computer. Hackers deeply appreciate innovative techniques that allow programs to perform complicated tasks with a few instructions. A program’s code is considered “beautiful” in its own right, having been carefully composed and artfully arranged. Learning to create programs that used the least amount of space almost became a game between early hackers.
  • Computers can change your life for the better. This belief is subtly manifest. Rarely will hackers try to impose their view of the myriad advantages of the computer way of knowledge on an outsider. Yet, this premise dominates the everyday behaviour of hackers. For sure, the computer has changed their lives, given them a focus, enriched them, and made them more adventurous. It has made hackers masters of a certain slice of the world. Since all this is so obvious to hackers themselves, they believe that surely everyone could benefit from experiencing this power. Surely everyone could benefit from a world based on the Hacker Ethic. This is the implicit belief of hackers, and they irreverently go beyond what is conventionally expected of computers — leading the world to a new way of looking at and interacting with computers.

The last two points of the traditional ethic may not seem surprising today, but they must be understood in their historical context. In the 70s, computers were strange and unfamiliar to most people. Where they meant anything at all, it was mostly administrative data processing, computing centres, punch cards and Teletype interfaces. Art, beauty and life changes were not mainstream notions associated with computers.

The hacker attire

Hackers dress for comfort, function, and minimal maintenance rather than for appearances (some, perhaps unfortunately, take this to extremes and neglect personal hygiene). They have a very low tolerance for suits and other “business” attire; in fact, it is not uncommon for hackers to quit a job rather than conform to a dress code.

When they are somehow pushed into conforming to a dress code, they will find ways to subvert it, for example, by wearing absurd novelty ties. Most hackers I know consider a tie as a strangulation device that partially cuts off the blood supply to the brain… which explains the behaviour of tie-wearers. A tie could bestow upon you the reputation of a super-loser, a suit-wearing super-user with no clue — someone with root privileges on a UNIX system but no idea what he is doing; the equivalent of a three-year-old with an AK-47 for a toy. In times of dire stress, he may roll up his sleeves and loosen the tie about half an inch. It seldom helps.

Female hackers almost never wear visible makeup and many use none at all.

How to become a hacker

In his essay by the same name, Eric Steven Raymond lists, among other things, the basic hacking skills for wannabe hackers. He recommends five languages: Python, Java, C/C++, Perl and Lisp.

Python. It is cleanly designed, well documented and relatively kind to beginners. Despite being a good first language, it is not a toy; it is very powerful and flexible, and well suited for large projects. Paul Graham points out that many hackers use Python because they like the way its source code looks.

That may seem a frivolous reason to choose one language over another. But it is not as frivolous as it sounds — when you program, you spend more time reading code than writing it. You push blobs of source code around the way a sculptor does with blobs of clay. So a language that makes source code ugly is maddening to an exacting programmer, as clay full of lumps would be to a sculptor.

Java. Eric Raymond suggests that Java, too, is a good language to learn to program in. Many hackers today would not agree; their main objection is that Java is not malleable. Malleability of the medium while programming is part of the process of discovery, which includes understanding all the requirements and forces — internal or external — that a system must be designed around.

James Gosling, best known as the father of the Java programming language, in his paper “Java: An Overview”, says, “Very dynamic languages like Lisp, TCL and Smalltalk are often used for prototyping. One of the reasons for their success at this is that they are very robust… Another reason … is that they don’t require you to pin down decisions early on. Java has exactly the opposite property: it forces you to make choices explicitly.”

The difference between languages like Lisp and Java, as Paul Graham points out in his book Hackers and Painters, is that Lisp is for working with computational ideas and expression, whereas Java is for expressing completed programs.

As James Gosling says, Java requires you to pin down decisions early on. And once they are pinned down, the system — the set of type declarations, the compiler, and the runtime system — makes it as hard as it can for you to change those decisions, on the assumption that all such changes are mistakes you are inadvertently making. The effect is like having governors (speed limiters) fitted to your off-roader to prevent fools (you?) from doing too much damage. Hackers don’t like a language that talks down to them. Hackers just want power.
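
To make Gosling’s point concrete, here is a minimal sketch of what “not pinning down decisions” feels like at a Common Lisp prompt (the function and its behaviour are my own illustration, not taken from his paper or Raymond’s essay). You define a function, use it, change your mind and simply redefine it; the running system picks up the new definition without being recompiled or restarted.

    ;; First attempt: a flat ten per cent discount.
    (defun discounted-price (price)
      (* price 0.9))

    (discounted-price 200)   ; => 180.0

    ;; Change of mind, mid-session: never knock off more than 50.
    ;; Redefining the function at the prompt is all it takes;
    ;; every caller sees the new behaviour immediately.
    (defun discounted-price (price)
      (max (- price 50.0) (* price 0.9)))

    (discounted-price 200)   ; => 180.0 (10% is still within the cap)
    (discounted-price 800)   ; => 750.0 (the 50-unit cap now applies)

In Java, by contrast, the same change of mind would typically mean editing the class, recompiling, and restarting whatever was using it.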

C/C++. If you get into serious programming, you will have to learn C, the core language of UNIX. C++ is very closely related to C; if you know one, learning the other will not be difficult. Neither language is a good one to try learning as your first, however.

Perl. Another language of particular importance to hackers is Perl, which is worth learning for practical reasons. It is very widely used for active Web pages and systems administration, so that even if you never write Perl, you should learn to read it.

Lisp. The truly serious hacker should consider learning Lisp. Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp a lot.
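
That enlightenment is hard to convey in a paragraph, but here is a small, hedged taste of the extensibility mentioned earlier (the while macro below is my own illustration; Common Lisp already has LOOP and DO built in). Because Lisp code is made of the same lists the language itself manipulates, an ordinary programmer can add a new control construct that reads exactly like a built-in one:

    ;; A `while` loop added to the language with a plain macro.
    (defmacro while (condition &body body)
      `(do ()
           ((not ,condition))
         ,@body))

    ;; It now reads like part of the language:
    (let ((n 0))
      (while (< n 3)
        (print n)
        (incf n)))   ; prints 0, 1 and 2

In most other languages, a new looping construct has to come from the compiler writers; in Lisp, the user can add it in a few lines.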

Get Linux

The single most important step any newbie can take toward acquiring hacker skills is to get a copy of Linux or one of the BSD-Unixes, install it on a personal machine, and run it. Yes, there are other operating systems in the world besides UNIX/Linux. But they’re distributed in binary — you can’t read the code and you can’t modify it.

Trying to learn to hack on a Windows machine is like trying to learn to dance while wearing a body cast. Besides, UNIX is the operating system of the Internet. While you can learn to use the Internet without knowing UNIX, you can’t be an Internet hacker without understanding UNIX. For this reason, the hacker culture today is pretty strongly UNIX-centred.

So, bring up a Linux. Learn it. Run it. Tinker with it. Talk to the Internet with it. Read the code and modify it. You’ll get better programming tools (including C, Lisp, Python and Perl) than any Microsoft operating system can dream of. You’ll have fun and you’ll soak up more knowledge than you realise, until you look back on it as an ace hacker.

Are you a hacker?

You must earn the title of “hacker”, rather than just claim it. The same essay by Eric S Raymond that I mentioned earlier offers some invaluable tips on how to earn your status in hackerdom. Among other things, there are two main ways to win the respect of hackers (for the complete list, read the essay).

Write open source software. The first (the most central and most traditional) is to write programs that other hackers think are fun or useful, and give the program sources to the whole hacker culture to use. Hackerdom’s most revered demigods are people who have written large, capable programs that met a widespread need, and given them away, so that now everyone uses them.

Help test and debug open source software. They also serve hackerdom who stand and debug open source software. In this imperfect world, we will inevitably spend most of our software development time in the debugging phase. That’s why any open source author will tell you that good beta-testers (who know how to describe symptoms clearly, localise problems well, can tolerate bugs in a quickie release, and are willing to apply a few simple diagnostic routines) are worth their weight in rubies. Even one of these can make the difference between a debugging phase that’s a protracted, exhausting nightmare and one that’s merely a salutary nuisance.

If you’re a newbie, try to find a program under development that you’re interested in and be a good beta-tester. There’s a natural progression from helping test programs to helping debug them and then on to helping modify them. You’ll learn a lot this way and generate good karma with people who will help you later on.

To end, I will have to quote Eric S Raymond yet again, since he puts it so beautifully:

We half-joke about ‘world domination’, but the only way we will get there is by serving the world. That means you and I; and that means learning how to think about what we do in a fundamentally new way, and ruthlessly reducing the user-visible complexity of the default environment to an absolute minimum.

Computers are tools for human beings. Ultimately, therefore, the challenges of designing hardware and software must come back to designing for human beings — all human beings. This path will be long, and it won’t be easy. But we owe it to ourselves and each other to do it right. May the open source be with you!

References
  • How to Become a Hacker, an essay by Eric Steven Raymond
  • Hackers: Heroes of the Computer Revolution, by Steven Levy
  • The Daemon, the GNU, and the Penguin, by Peter H. Salus
  • OpenSources, by Chris DiBona, Sam Ockman and Mark Stone
  • The New Hacker’s Dictionary, by Eric S Raymond, MIT Press
  • Biopunk: DIY Scientists Hack the Software of Life, by Marcus Wohlsen
Hackers and the Open Source Revolution