Thursday, April 21, 2005

Hello, IDEA!

From the recent Java IDE discussions, it seems a good portion of Java programmers either don't know IntelliJ IDEA or simply haven't tried it yet. In this short screencast, I cover all the basics you'll need to get started using IDEA in about 10 minutes. Watch "Hello, IDEA!" No fancy tricks like in my last screencast; this demo is simple and easy.

Do you have a friend or colleague who hasn't tried IDEA? Send them a link. I'll be posting more advanced feature-specific screencasts soon. Stay tuned.

Monday, April 11, 2005

Language Oriented Programming: Everything is a Language

Some people don't 'get' Language Oriented Programming. It's a different perspective. Once you make the mental shift, everything starts to fall into place. Over the past few months, I've been looking at the software development world with LOP-coloured glasses, and I've come up with a quick zen-slap phrase which I hope will help shift your paradigm.

(If that last sentence made you think, "I hate the word 'paradigm'! It's a meaningless marketing buzzword," then read this before reading on below.)

Everything is a Language
Just as OOP introduced 'everything is an object', I propose for LOP that 'everything is a language'. Keep this in mind as you read about and discuss software, and suddenly things will look very different to you. But what the heck do I mean?

Programming is Communication
When you want to communicate an idea to someone, how do you do that? You use the tool that Homo Sapiens 1.0 has evolved over millions of years: Language. You'll speak, you'll make gestures. If you want the communication to last over time, you'll write it down in symbols. These days you can use fancy tools like keyboard, mouse, digital memory, and graphic display to write these symbols.

Now, when you want to instruct a computer, how do you do that? You write symbols using keyboard, mouse, digital memory, and graphic display. You're doing the same things, but instead of communicating ideas to people, you are communicating ideas to the computer. Now, of course, there are a couple of important differences, such as the fact that there are strict rules on which symbols are used and how they are arranged, and also the fact that the symbols have a very precise, concrete meaning (e.g. '2 + 2' actually performs a physical action in the computer to produce the symbol '4'). However, at the end of the day, a programmer's work primarily involves communicating to the computer via language.

The Editor is the Medium
Before I go into the examples, there is one important thing to know. Language does not exist without a medium. Spoken words require a mouth to speak, air to propagate sound waves, and an ear to listen. Sign language requires hands to make the signs, light to carry the image, and eyes to read the signs. Writing requires a pen, paper, and eyes. Programming requires a keyboard to type, an editor to accept the input, and a compiler or interpreter to execute the program. Take away any of these things, and communication is much more difficult.

Marshall McLuhan tried to emphasize this fact by saying that "the medium is the message". He, perhaps, overstated his case for emphasis. But the fact remains that the medium of language is an important, indispensable part of the language. So I will now put '2 + 2' together to state something that will likely bother some hard-core programmers: The editor is an important, indispensable part of the programming language. The editor is the medium of the programming language.

When you start to think 'everything is a language', you will start to see computer languages everywhere.

JUnit and TestNG
JUnit is the most popular unit testing framework, but there are many clones and variations. They are all languages. With a unit test, you attempt to communicate something about your system: when the environment is set up in this way and this action occurs, this is the result you expect. But have you noticed how ugly it is to write a test that says "this method should throw this exception"? That is because JUnit is written in Java, and the only way to express this in Java is to wrap the method call in a try/catch statement. JUnit is a language on top of Java. Notice how newer frameworks such as TestNG modify Java (with annotations) to make writing tests more natural? That is because they want to be true languages.
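To see the awkwardness concretely, here's a sketch. The scenario and names are invented for illustration, but the try/catch shape is exactly the idiom JUnit 3 forces on you, while TestNG's expectedExceptions annotation attribute says the same thing declaratively:

```java
// Sketch of the expected-exception idiom. The scenario is invented;
// the try/catch shape is the real JUnit 3 pattern.
import java.util.ArrayList;

public class ExpectedExceptionDemo {

    // JUnit 3 style: wrap the call, and fail if no exception arrives.
    static boolean junit3StyleCheck() {
        try {
            new ArrayList<String>().get(0); // should throw
            return false;                   // would be fail("expected exception")
        } catch (IndexOutOfBoundsException expected) {
            return true;                    // test passes
        }
    }

    // TestNG says the same thing in one annotation attribute:
    //
    //   @Test(expectedExceptions = IndexOutOfBoundsException.class)
    //   public void getOnEmptyListThrows() {
    //       new ArrayList<String>().get(0);
    //   }

    public static void main(String[] args) {
        System.out.println(junit3StyleCheck()); // prints "true"
    }
}
```

Four lines of control flow just to say "this should throw" versus one declarative attribute: that's the difference between encoding an idea inside Java and extending the language to say it directly.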

EasyMock and jMock
The Mock Objects pattern is another language, and two popular frameworks again try to adapt Java into a mock-object language. EasyMock uses dynamic proxies to generate mock objects on the fly, but notice how you need a clumsy MockControl object just to specify return values and exceptions? jMock goes further than EasyMock, allowing you to specify the expected inputs and outputs in a fairly readable way. But notice how you lose Java's language support for interfaces: you have to specify method names as Strings, which makes refactoring method signatures with an IDE more difficult. And while a jMock expectation may read fairly naturally, it isn't natural Java at all; it doesn't look like a normal Java method call, as it would with EasyMock. That's because both of these frameworks are languages on top of Java. I think it would be great if my editor allowed me to write out the expectations using special notation that isn't possible in Java, like this:

myObj.myMethod(obj equals expectedObj, date between startDate .. endDate) returns expectedResult;
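That notation isn't legal Java, but a fluent interface gets part of the way there. Everything below (Expectation, withArgEqualTo, returns, and so on) is invented for illustration; it is not the API of any real mocking library:

```java
// Hypothetical fluent-interface sketch: how close plain Java can get to the
// wished-for notation above. All names here are invented for illustration.
public class ExpectationDemo {

    static class Expectation {
        private final String method;
        private Object returnValue;

        private Expectation(String method) { this.method = method; }

        static Expectation call(String method) { return new Expectation(method); }

        // In a real framework these would record argument matchers.
        Expectation withArgEqualTo(Object expected) { return this; }
        Expectation withArgBetween(Object lo, Object hi) { return this; }

        Expectation returns(Object value) { this.returnValue = value; return this; }
        Object returnValue() { return returnValue; }
    }

    public static void main(String[] args) {
        Expectation e = Expectation.call("myMethod")
                .withArgEqualTo("expectedObj")
                .withArgBetween(1, 10)
                .returns("expectedResult");
        System.out.println(e.returnValue()); // prints "expectedResult"
    }
}
```

Even at its best, though, the method name is still a String and the 'between' range is still two arguments to an ordinary call. The notation I want simply isn't expressible inside Java's grammar; it needs the editor's help.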

XML this and XML that
Have you noticed the proliferation of all these XML dialects? Why is that? It's because XML gives you a standard and relatively easy way to invent your own languages. The parser is already built, all you need is a DTD for your syntax and to wire up the grammar to your Java (or other language) objects for your semantics. Ant is one example, but there are zillions more. Obviously, there is a need for the ability to easily create specialized languages. This is one of the primary forces behind Language Oriented Programming.
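Ant's build file is exactly this kind of XML language: the element names are Ant's vocabulary, the parser is a standard XML parser, and the semantics live in Ant's Java task classes. A minimal sketch (the project name, paths, and target names are examples):

```xml
<!-- A minimal Ant build file. 'target', 'javac', and 'junit' are real Ant
     vocabulary; the names and paths are made up for illustration. -->
<project name="hello" default="compile">
    <target name="compile">
        <javac srcdir="src" destdir="build/classes"/>
    </target>
    <target name="test" depends="compile">
        <junit haltonfailure="true">
            <classpath path="build/classes"/>
            <batchtest>
                <fileset dir="build/classes" includes="**/*Test.class"/>
            </batchtest>
        </junit>
    </target>
</project>
```

Notice that this is a language with its own nouns (targets, filesets) and verbs (javac, junit), and its own dependency semantics — none of which Java itself can express.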

Struts, Spring, and Rails
Web frameworks are yet another kind of language we are seeing a lot of these days. Notice how Struts defines page navigation using XML, extends JSP with tag libraries, and provides special facilities with its class framework? I wonder why such a project as Struts Console exists, or the various IDE plugins. Could it be because Struts is a language? Spring wires up your application with a configuration script (hint: in its own specialized XML language). Rails rewrites entire classes, re-interprets method calls, and has dozens of other meta-programming tricks just to make your life easier. These frameworks are all attempts to work around the deficiencies of their host languages. Imagine the power that would be possible if they could be languages in their own right, with things like IDE support, refactoring, and whatnot. (I'm talking about LOP, people.)
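Spring's wiring script makes the point concretely. The bean names and classes below are hypothetical, but the beans/bean/property elements are Spring 1.x's actual configuration vocabulary:

```xml
<!-- Minimal Spring 1.x wiring. Bean ids and classes are invented examples;
     the element vocabulary is Spring's own little language. -->
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN"
    "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="orderService" class="example.OrderService">
        <property name="repository" ref="orderRepository"/>
    </bean>
    <bean id="orderRepository" class="example.JdbcOrderRepository"/>
</beans>
```

This is a dependency-declaration language: it has its own semantics (instantiation, injection, lifecycle), but because it lives in plain XML files, your Java IDE can't refactor, navigate, or verify it.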

Aspect Oriented Programming
Why isn't it popular? No IDE support. 'Nuff said.

Ruby, Python, PHP, and even JSP
Dynamic languages are on the rise. Why is that? Let's put on our LOP glasses. Static languages are great because you get lots of support from your tools. The compiler gives you lots of hints, and can do things like optimize your code. A smart IDE like IntelliJ IDEA can make judgments about code, allowing static analysis, refactoring, code completion, and navigation. But static languages also have some downsides. Every little detail has to be specified, sometimes more than once. Little errors can blow up in your face if you're not disciplined in your coding practices.

Dynamic languages make a trade-off. They try to be more flexible, and get out of the programmer's way, at the expense of losing some of the features of a static language. Sometimes, this trade-off is in the interests of the programmer. With more mature languages like Ruby and Python, the advantage is slowly tipping more and more in favour of dynamic languages. Wouldn't it be cool if you could have the power and flexibility of a dynamic language, with all the extra cool features of a static language, and a kick-ass editor to boot? Hmmm.

Keep your eye out for this stuff. It's going to become more and more obvious to you. All you have to remember is that everything is a language.

Thursday, April 07, 2005

Paradigm, Schmaradigm

Let's clear something up. Yes, 'paradigm' has been hijacked by evil marketing forces. No, it is not a meaningless word. The problem is most people don't know what 'paradigm' means, so marketers are able to twist it and turn it and slap it onto any old crappy idea. If everybody knew what it meant, it would lose all its mysterious power, and marketers would give it up for some more fuzzy, vacuous word, like 'folksonomy'. So with that goal in mind, I'm going to briefly educate you on the real meaning of paradigm and immunize you against the marketing buzzword of the same name.

The meaning of paradigm I'm using here comes from Thomas Kuhn in his famous book The Structure of Scientific Revolutions. A paradigm is a set of assumptions that cause you to think about things in a specific way. For example, assuming that the Earth is the centre of the universe might cause you to think that there is inherent order in the universe, with everything in its place, and with Earth holding a special, exalted position. The four elements (earth, water, air, and fire) naturally seek their proper place. The sun, moon, planets, and stars revolve in perfect circles around the Earth. This paradigm, for two thousand years, helped people understand the world.

No paradigm is perfect, and there will be anomalies that the paradigm cannot explain. In the beginning, the anomalies won't seem important, and people will ignore them, or make small exceptions or corrections in the paradigm to account for them. But over time, as understanding deepens, the anomalies will gain significance until they are too problematic to ignore any more. For example, astronomers observed that the planets didn't move in perfect circles, so they invented the concept of epicycles to try to correct their model. Eventually, the required epicycles became so messy that the geocentric cosmology reached a crisis.

When a crisis is reached, new paradigms arise to compete against the prevailing paradigm, and eventually a new, usually better one wins out by eliminating the anomalies of the old paradigm. This process is called a paradigm shift. For example, by considering that the Sun is the centre of the universe, and that orbits are elliptical instead of circular, the need for epicycles vanishes. (Some people use the term 'paradigm shift' to refer to an individual person's internal shift in viewpoint from one belief system to another. I confess I also use the term that way, but I only apply it to genuine paradigms, not just any viewpoint shift.)

There's another important aspect to paradigm shifts, which is the historical fact that they tend to be nasty, with the old guard staunchly defending their tried-and-true paradigm, and the new generation brazenly promoting their new-and-better one. Sometimes, the only way to complete the paradigm shift is to wait for the old guard to literally die off! For example, the Catholic church had so much invested in the geocentric paradigm that it took two generations of scientists (first Copernicus, later Galileo and Kepler) to establish the heliocentric paradigm among the mainstream scientific community. And we all know what happened to Galileo. Thomas Kuhn called such messy scientific paradigm shifts 'scientific revolutions', and the Copernican paradigm shift is in fact considered the Scientific Revolution, one of the major events of the Renaissance.

One last important feature of paradigms is that they are often mutually unintelligible, and trying to explain something in the context of one paradigm, when the other person holds a radically different paradigm, can cause huge misunderstandings and pointless arguments. People talk right past each other, saying things that make perfect sense to themselves, but are completely meaningless to the other person. You can imagine the different answers you would get from a geocentrist versus a heliocentrist when asked the question, "Why are we here?"

The word paradigm, when used in the sense of Kuhnian paradigm shifts and scientific revolutions, is a very powerful and useful concept. It can help you understand major events in the world, and to see the recurring structure in them. So, now that you understand the real meaning of 'paradigm', you can just ignore the clueless marketing drones who babble about 'optimizing your paradigm', and not get so worked up over this perfectly good and useful word.

Tuesday, April 05, 2005

IntelliJ IDEA Skeptic Learns to "Develop with Pleasure!"

I couldn't pass up this classic. Blogger Doug reluctantly discovers TDD and IntelliJ IDEA. From "how do I create a new class?" to "living without it may be hard to do" in 4 days. Nice! My favourite part is the initial hard-core skepticism when he says "30 days is not long enough for me to evaluate" IDEA, followed by a somewhat unrealistic evaluation scenario: "It is resource hungry," he claims, "The box I'm using at the moment is a bit low on memory, only 256Mb... It's an HP e-Vectra I picked up at a computer auction for < $150 about a year ago." I mean, seriously! You can't expect to run a power drill on AA batteries. ;-)

Overall, I think Doug's experience is actually quite common. Switching IDEs can be time-consuming and mind-bending, especially if you've gotten deep into your previous IDE's habits and quirks. Each IDE has its own personality, and will take some getting used to. The most pragmatic approach is to keep your expectations realistic, have enough time and resources to give each IDE a fair shake, and then pick the best tool for the job. Different tools work better on different jobs, so it's good professional practice to know how to use multiple tools from your toolbox.

You can download IDEA here, and JetBrains is currently offering a special personal license for half the regular price. Don't miss out.

Monday, April 04, 2005

Rails is Ruby's "Killer App"

Forget the hype over Rails -- not important. Forget the anti-hype too -- irrelevant. Instead, just look at this graph. See any correlation? Rails is a killer app, regardless of hype or anti-hype. Since Rails appeared on the scene, Ruby has seen a 200% increase in web traffic over three months.

Personally, I think this is great. Ruby is a pleasure to program in, and provides a good alternative to other dynamic languages, and sometimes even the more-mainstream languages. Any growth in Ruby's adoption will drive innovation and open up Ruby for mainstream acceptance. Congrats to David Heinemeier Hansson for his great work on Ruby on Rails, and his successful evangelism of both Rails and Ruby.

Friday, April 01, 2005

Microsoft Facing a Classic Innovator's Dilemma

.NET bloggers are upset over the recently announced pricing scheme for Visual Studio .NET 2005. What the heck is Microsoft thinking? Well, it's called an Innovator's Dilemma. Basically, Microsoft is being forced up-market by cheaper and simpler open source tools. And when the JetBrains .NET IDE arrives on the scene, adding usability and intelligence to the equation, well... I'll let you draw your own conclusions.

An Innovator's Dilemma is when a company (or innovator) follows good, rational, sensible management and marketing strategies and still ends up failing, despite doing everything right. The term was coined by Clayton Christensen in his book The Innovator's Dilemma. Christensen describes a six-step process by which established companies are forced to move up-market:

Step 1: Disruptive technologies are first developed within established firms
Step 2: Marketing personnel seek reactions from their lead customers
Step 3: Established firms step up the pace of sustaining technological development
Step 4: New companies are formed, and markets for disruptive technologies are found by trial and error
Step 5: The entrants move up-market
Step 6: Established firms belatedly jump on the bandwagon to defend their customer base

There's an implied, unmentioned seventh step which I'll spell out for you.

Step 7: Too late. Too bad, so sad. Bye bye. ;-)

Do you remember a time when Visual Studio was the best IDE on the planet? I do. Then a little thing called IntelliJ IDEA came on the Java scene and took that honour. How did Microsoft respond? I'll invent a little scenario that probably isn't far from the truth:

MS Techie: "Our next Visual Studio should have all those cool features IntelliJ IDEA has."
MS Marketer: "Okay, let me ask our big customers what they want."
Big Company: "Well, VS already has add-ins like ReSharper for those cool features, and there's lots of open source tools for things like automated testing and builds. What we really want is support for our huge teams of Architects, Developers, and Testers."
MS Marketer: "Okay, let's push the envelope there and come up with a great product for big teams. We'll call it ... Team System! And since big companies are willing to pay big bucks, let's price it around $10,000 per year! I can just see the money rolling in."

That's about where Microsoft is right now, and that represents Steps 1 through 3. Step 4 is already happening as JetBrains improves on ReSharper 1.5 with the upcoming ReSharper 2.0. Step 5 is just around the corner, when JetBrains will follow up ReSharper 2.0 with their .NET IDE (tentatively named ReSharper IDE). Step 6 occurs when MS realizes that the .NET IDE is gobbling up all their customers. But by then it will be too late (Step 7).

Plug: ReSharper 2.0 will be released shortly after VS.NET 2005, with tons of new features, but you can start enjoying the pleasures of ReSharper 1.5 today, and the upgrade to 2.0 is free, so there's no need to wait. Get it now while it's still only $99 ($149 starting April 5th).