Jetty and JNDI and Atomikos.

Yes, yet another “hey, I just found this out, so you need to know it too”. And by the way, if you do find this useful, I would really like you to comment on this post so that other people will find it too.

I needed to set up Jetty 8 to use a JTA provider. Quick googling around turned up a handful of documentation resources to take into account.

The kicker being of course that those documentation resources are out of date. Pause here for a quick rant: please, so that you do not go to the special hell reserved for people who cause huge numbers of lives to be wasted, if you are releasing code and documentation into the wild, update the frakking documentation! Sadly, most google searches for anything about Jetty go to those elderly Codehaus pages, rather than the more up-to-date but obscure pages at http://wiki.eclipse.org/Jetty

In order to get the combination of Jetty 8.x, Atomikos 3.8.x and JNDI working, you need to look in the right place, and use the right class names in your Jetty JNDI definition:


<New id="tx" class="org.eclipse.jetty.plus.jndi.Transaction">
  <Arg>
    <New class="com.atomikos.icatch.jta.UserTransactionImp"/>
  </Arg>
</New>

and to copy the following from the Atomikos lib/ directory into the Jetty lib/ext/ directory:


geronimo-j2ee-connector_1.5_spec.jar
geronimo-jms_1.1_spec.jar
geronimo-jta_1.0.1B_spec.jar

and finally to copy all the JARs from the Atomikos dist/ directory into the Jetty lib/ext/ directory.

Note that I have not yet tested this exhaustively, and suspect that not all of these JARs are needed.

Geronimo!

Ok. More adventures with open source. One of the things I’ve got on my (lengthy) list at the moment is to have a look at some light(er) weight servlet and J2EE containers. JBoss is giving me hives; you may be aware of that, I’ve mentioned it before.

So the first one I wanted to look at was Geronimo, partly because it’s from Apache, partly because it’s got the option of being wrapped around either Jetty or Tomcat. I trotted off, grabbed the 2.2.1 tar ball and threw it onto my Mac so that I could run it up on the train. That’s where the irritation started. From the documentation, it was evident that in other *nix environments, I should just be able to unpack the tar ball and run bin/geronimo.sh run. I tried that on my Mac, and was hit by the dreaded Unable To Decrypt error.

It was pretty obvious that there were two parts to the solution: get a full JDK installed on the Mac, and ensure that the run time environment for Geronimo has JAVA_HOME pointing to the right place.

I could rant endlessly about Apple’s arcane treatment of Java, but won’t. If you have a developer account, it’s reasonably easy to grab a fairly recent JDK and get it installed. What’s not so obvious is where the hell the JDK winds up on your machine after install. Apple aren’t particularly helpful with this: the JDK winds up in /Library/Java/JavaVirtualMachines. Hence I needed to add JAVA_HOME to my profile:


export JAVA_HOME=/Library/Java/JavaVirtualMachines/1.6.0_31-b04-415.jdk/Contents/Home

That sorted Geronimo out nicely – the boot log showed the right JARs were being found:


Runtime Information:
Install Directory = /Users/robert/Desktop/geronimo-jetty7-javaee5-2.2.1
Sun JVM 1.6.0_31
JVM in use = Sun JVM 1.6.0_31
Java Information:
System property [java.runtime.name] = Java(TM) SE Runtime Environment
System property [java.runtime.version] = 1.6.0_31-b04-415-11M3646
System property [os.name] = Mac OS X
System property [os.version] = 10.7.3
System property [sun.os.patch.level] = unknown
System property [os.arch] = x86_64
System property [java.class.version] = 50.0
System property [locale] = en_US
System property [unicode.encoding] = UnicodeLittle
System property [file.encoding] = MacRoman
System property [java.vm.name] = Java HotSpot(TM) 64-Bit Server VM
System property [java.vm.vendor] = Apple Inc.
System property [java.vm.version] = 20.6-b01-415
System property [java.vm.info] = mixed mode
System property [java.home] = /Library/Java/JavaVirtualMachines/1.6.0_31-b04-415.jdk/Contents/Home
System property [java.classpath] = null
System property [java.library.path] = {stuff in java.home}

Nice! Trouble is – the problem persisted. Here’s the trick: the initial install of Geronimo creates various properties files. If that initial startup fails, the properties files have borken information in them, and you will never get it to start up. Let me repeat that:

You MUST get JAVA_HOME in the Geronimo environment pointing to a valid JDK before you try to run Geronimo, or daemons will fly out of your nose and eat your face. You have been warned.
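For the record, the sequence that worked for me, as a sketch (the tar ball name is my guess at the 2.2.1 bundle’s file name; point JAVA_HOME at whatever JDK you actually have):

```shell
# Point JAVA_HOME at a real JDK *before* the first start - the first
# start bakes environment-derived values into generated config files.
export JAVA_HOME=/Library/Java/JavaVirtualMachines/1.6.0_31-b04-415.jdk/Contents/Home

# If a previous start already failed, the generated config is suspect;
# the simplest recovery I know of is to unpack a fresh copy and start again.
tar xzf geronimo-jetty7-javaee5-2.2.1-bin.tar.gz
cd geronimo-jetty7-javaee5-2.2.1
bin/geronimo.sh run
```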

Sharpening the tools

Before I packed it all up, I had a workshop habit that I suspect other makers of sawdust shared. Before embarking on any work, I would spend some time cleaning up the workshop area. I would make sure the bench was clean and clear, sweep the floor, check that tools were sharp and sharpen them if necessary. I’d check the tables on the big tools for rust, and the tracking on the bandsaw. Sometimes I would get out particular tools and lay them out on the bench, ready to go.

And all the while I was doing this, I would be thinking about the work I was going to engage on, think about the processes I was going to follow, the pattern of work. I find this enormously relaxing, and a fantastic way to focus. I found that I would be trimming away, sweeping away, everything in my head that I didn’t need for the job at hand.

Over the past few years I’ve been trying to take the same approach to programming work, with a similar resultant focus (and one day I hope that I find it relaxing too). Thus, today, I’m re-reading Better Builds with Maven. Sharpening the tools and cleaning the bench.

One nice thing about re-reading books like this, particularly well written ones, is that there is always something to learn, some nuance that pops out that was previously invisible, highlighted by fresh experience. Even the import of a simple statement like “convention over configuration” can change over time.

Just like buses

You wait forever, and then two turn up at the same time.

Which is what happened with job offers on Friday. In a couple of days’ time, when contracts have been signed, I’ll tell you which two companies, and why I chose one over the other, but suffice it to say that I found myself in the remarkable position of having two really good offers come in within a couple of hours of each other.

I’ve dealt with a lot of agencies and agents while I’ve been hunting for work in London. Most of them have been ok, a few of them have felt incredibly dodgy, and three of them proved to be very good. Maybe I’m biased since these were the ones that got me the best chances, but it did feel that these three companies seriously thought about my reported history and interests, and made intelligent and dedicated attempts to match that against client needs. So, some free advertising for them. If you’re looking for work here in London, I strongly suggest you talk to these guys and gals:

  • ABRS went out of their way to put interesting things in front of me;
  • Salt have a good focus on marketing candidates and helping candidates market themselves;
  • Bearing have an excellent understanding of the market sectors they aim to service, and a good understanding of technology.

I’d also like to throw some laurels in the direction of BITE Consulting. They’re a small shop, with a fairly specific aim – placing people into contract roles, generally folk sourced from the colonies – but the products they offer to contractors are extremely competitive and sensible. If I’d been pursuing contract work (which I would have turned to if these permanent positions hadn’t popped up), there is zero doubt in my mind that I would have worked through and with BITE.

We now return you to regular programming.

Agility

A too frequent question over the past couple of weeks has been “what characterises Agile development for you?”, or some variant on that. Thinking about it this evening, I am struck by how much stuff has accumulated around what is a very simple, elegant manifesto. Go look at the Wikipedia article to see how much has been built around the idea.

In particular, the attitude I keep confronting is, loosely, “Agile = SCRUM”. A few people extend that to “Agile = SCRUM + CI”. More commonly I’m seeing the equation “Agile = SCRUM + Testing”, which is interesting, because what people are really articulating is “Agile = XP + some vague notions of short release cycles”. I really think that in a lot of places, Agile has become as much of a vast over-engineered framework as any older methodology. On the other hand, I am a curmudgeon.

It’s worth restating the Agile Manifesto though, ripped straight from the site:

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

Words to live by, or at least to work by.

Mirror World…

Someone, probably William Gibson speaking through Cayce Pollard, wrote about the Mirror World of travel: the notion that it is not the big differences that create a sense of unease, but the myriad small and barely noticeable ones. Like the shape of electrical outlets.

An odd one for us, that we’re still getting used to. In the UK they drive on the left, but people standing on the escalators are expected to stand on the right. On the other hand there’s a general convention to go up and down the left side of stairs. And it’s ok to take dogs on the escalators, as long as you carry them.

But the one that is driving me absolutely mad, as I’m in and out of job interviews that involve coding tests performed at a computer, is that the standard over here is for the “@” not to be Shift-2, but instead to sit where I’m used to finding the double quote, i.e. one right-ward twitch of my little finger. For no readily apparent reason, the two are reversed, which makes touch-typing code very annoying.

Scope Creep

One thing that I’ve noticed over a quarter of a century of banging out code is how the expectations for what a coder will know have expanded enormously.

When I began, the expectation was that you had an understanding of how computers roughly worked – the old CPU plus memory plus storage model that we’ve had since the beginning – and facility in one or two languages. Cobol, Pascal, Basic, Fortran, some assembler. It was anticipated that you’d be able to sit down with a manual and a compiler, and teach yourself a new language in a few days. The important part was knowing how to think, and how to really look at the problem. And of course, how to squeeze every last cycle out of the CPU, and do amazing things in a small memory footprint.

Around the turn of the century, there was not much change. Your average coder was expected to be comfortable working in a three-tier architecture, to have some vague idea about how networks and the internet worked, be comfortable with SQL and a database or two, to have some notion of how to work collaboratively in a multi-discipline team. And of course, to have a deep understanding of a single language, and whatever flavour-of-the-month framework or standard libraries existed. UML and RUP were in vogue, but Agile was newfangled, and there was a wall of design documentation to ignore.

Now is the age of the ultra-specialist. You need to be server side, or middleware, or client side. You need to know a language intimately, and be vastly knowledgeable about half a dozen ancillary technologies – in the Java world, for instance, you need to grok Spring, and JMS, and JMX, and Hibernate, and Maven, and a CI tool, and a specific IDE. You need to understand crypto, and security, and enterprise integration and architectural patterns, and networking.

I fear that this rant has gone vague and off the rails. There is a strange paradox in place now: we are expected to specialise deeply in the problem spaces we address, but carry in our heads a hugely expanded toolset.

Making a Mockery

As I’m back on the hunt for a job, I’m going back and brushing up on technologies I’ve not necessarily used for a while, out of interest as much as anything else. This has been enlivened somewhat by realising I didn’t have any IDEs or other coding tools – other than Xcode and TextWrangler – on my laptop. The effort of getting things set back up so that I can play has reminded me of why I love, and why I loathe, the OpenSource Java community.

I’ll write up some notes on bits and pieces that I want to remember as static pages elsewhere, so for the moment let me just mutter about what I got running, and what has driven me nuts.

To start with, it pleases me no end that Mac OS X comes natively supplied with Maven. That gives me some base assurance that I can, at a minimum, launch a terminal and build and test without having to download the world first (other than Maven’s habit of downloading the world, of course).

Next up, I grabbed down Netbeans 7.1. I’d not used it with Maven previously, and was pleased to discover that the IDE plays nicely the Maven way, rather than desperately wanting to make Maven work the IDE way.

Penultimately, I got an Eclipse running – the SpringSource Indigo bundle – and began to grit my teeth. Don’t get me wrong. I like Eclipse a lot. But for the last eight months I’d been using IntelliJ, which means my brain and fingers had been retrained to different shortcuts, and trying to switch back to Eclipse is like trying to remember a language you’ve not spoken since high school. The other thing which always makes me grit my teeth is the richness of Eclipse. It’s the EMACS of IDEs, infinitely variable and configurable, and trying to get a fresh download to look and feel like you are used to is painful, annoying and tedious.

And finally, I sat down to fiddle with JMock again. And spent some time tearing my hair out. I was reading through a tutorial from the JMock site, and was damned if I could get it compiling. It was pretty obvious why – the JMock objects I was expecting weren’t in the JMock JARs. Maybe it was too late at night for me to be thinking straight when I downed tools, as picking it up again this morning revealed my problem: the tutorial was referring to a slightly older version of JMock, even though it was in a JUnit 4 context, and the current version is radically different.

Therein lies one of the things that drives me absolutely nuts about the open source Java Community: too many major, key, central projects have inaccurate, out-of-date documentation, and too much key knowledge is passed around in folklore.

Only if my hair is on fire.

I think I need to educate, or re-educate, my cow-orkers to understand what it means when I put on headphones while working. And there are one or two that I really need to tell that I cannot hear them if they come up behind me and speak softly to attract my attention. On the other hand, most of the reason is that I have run out of attention to spare.

There are certain classes of IT problems that end up occupying my entire consciousness and are extremely difficult to let go of when I walk out the door, particularly if they take several days to resolve. Maybe physicists and philosophers have better mental work benches, and can put the work down to re-emerge from their deep cognitive dives without the bends. I can’t.

If the nature of the problem is both time-bound and space-bound, I need to disappear inside my own head. What I mean is: when the symptoms of the problem and the behaviour of possible contributors are spread across human-scale rather than machine-scale time, and more than one thread of operation is in play, computation is smeared across the possibility space.

I really have no perfect tool for dissecting these sorts of problems. My workbench is scattered with a variety of tools for working on different parts of the problem. If you looked over my shoulder you would usually see that I have a text file open called “notes” or “defect xyz”, which is a mix of apparently context-free reminders to myself and a scantily sketched monologue as I propose and reject different theories. You would usually see a paper notepad with faint pencil scribbles, and a variety of abstract diagrams, mostly scratched out. I would probably have an IDE open with code highlighted, and a terminal window showing logs. What you cannot see is what’s in my head: elaborate mental models of what I believe to be the space-like computational state smeared across the problem time. The visible symbols are just reminders, annotations, histories of abandoned models.

There are two implications of this. First, I can’t put it down when I go home, or to eat, or to sleep. A sufficiently complex set of models will take up all my thoughts, there’s just no room in my head for any other sensible responses or rational thoughts. I become a dreamwalking zombie. Second, and possibly most pertinently: if you ask me to take my headphones off and pay attention to you, there’s a very high probability that the mental model currently being constructed will collapse, and I have to start from the beginning again. Your five minute interruption will probably blow an hour or more’s work.

So please. If I’ve got my headphones on, please, please don’t ask me to emerge from my fugue state even if the room is on fire. Only if it has spread far enough that my hair is burning.

The Trouble With Passwords (Again)

Part of my efforts to grab my life by the corners and twist it into a different shape was a decision to switch my “primary” computer to be a laptop, rather than the ailing iMac. I’ve almost finished making that move, and have just a few things to move across from the old machine onto this laptop. So I sat down last night to recover some passwords and account information that I had been missing that I knew was in the Keychain on the old machine. And there the hassle began again.

It’s been pointed out, and I’ve ranted about it in the past in different forums, that the Mac OS X Keychain is a parson’s egg: good in parts. It does a really good job of noting authorisation credentials for software running as the current logged in user, pretty well invisibly, silently and hassle free. Most software that needs authentication credentials has been written correctly to use the Keychain, and as long as nobody swipes both the keychain file and the master password, it’s reasonably secure.

Where the Keychain Access program falls down badly though is usability for a specific but pretty common use-case: being able to bulk-export credentials for import to a different keychain.

It’s not that Apple are unaware of this as a failing in the product: their support forums are littered with people asking how to do a bulk export, and the response is always the same – use the Migration Assistant to move the whole account from one machine to another. And there’s the fallacy in their design world view: Apple design software with the belief that there is a one-to-one relationship between a user and a user account on a single machine. For all their talk about cloud services, they still have this vision of a single user with a single user account instance publishing to the cloud. Bzzt. Wrong. It’s only loosely true for most users, and very wrong for the minority that for one reason or another have different accounts, potentially on different computers, for different uses and contexts.

The canonical and simple example is where I was a few months ago – a main desktop which was a document repository and work bench and media player, and a laptop which contained a subset of documents that were currently being worked on. And a computer at my work place with some internet connectivity, and a strict injunction against plugging private devices into the network. Oh, and the FrankenPuter Windows 7 box I built for games. Getting this to work, in general, was fairly straight forward – I used ChronoSynch to keep specific folders in synch, and Spanning Sync to keep calendars and addresses in synch between the two computers and Google. Using IMAP for Gmail kept mail sort of in synch, and Chrome’s facilities for synching bookmarks between instances via Google works ok.

But two things did not work at all well. There was no good way to keep two instances of Things in synch (but they are working on that), and absolutely no way to keep credentials and secure notes in synch (caveat, no way without committing to drinking the 1Pass kool-aid, which I may yet do).

I sat down on Monday night to finally get all the passwords out of the iMac keychain and onto the laptop somehow. Exercising Google-Fu, I found a pretty good AppleScript solution which did the trick, even if it had to deal with the annoyances of the Keychain. The trick was to unlock each keychain before running the script, then for each item in each keychain, as the script was running, click “Allow” on the two modal dialogs that Apple threw up. Somewhere over 300 clicks later, I had a text file with pretty well all I needed in it, and a firm decision to leave the data in a text file for reference, and not muck about trying to get it into the laptop keychain (See, I’m already thinking that 1Pass might be the better solution).
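I went the AppleScript route, but for the record there is also a command-line way to get much the same dump, via macOS’s security tool (you still get an Allow dialog for every item; -d is what asks for the decrypted secrets):

```shell
# Unlock first, then dump the login keychain including secret data;
# expect one "Allow" click per keychain item, same as the AppleScript way.
security unlock-keychain login.keychain
security dump-keychain -d login.keychain > keychain-dump.txt
```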

The next part of the puzzle was to get it onto the laptop. Now I’m slightly paranoid about things like this, and wanted to have at least a third copy while I got it across. Ok, it was late at night, and I wasn’t thinking straight. I’ve misplaced my last USB thumb drive (damn, need another), so decided to toss the file onto DropBox to aid in the transfer. Which led to the next issue: there was no way I would throw this file into the cloud without it being encrypted, and hard encrypted.

Ok, easy solution there – encrypt it with PGP. Done. Now to install PGP on the laptop… wait a minute, when did Symantec buy up PGP? And they want how much for a personal copy? (As an aside, for an example of entirely obfuscating costs and product options, the Symantec PGP subsite is a masterpiece). When it comes to companies I am loathe to entrust with protection of my secrets, Symantec is pretty high on the list. Ok, second plan, grab MacGPG. I’ve used earlier versions, and have used GPG and its variants on other platforms, and am confident in it. On the other hand, I really miss the point-and-click integration of MacPGP. Fortunately there’s a project under way to provide a point-and-click interface on top of the underlying command line tools, and I’m pretty happy with what they are doing. If you need it, go check out GPGTools, but be aware that you’ll probably need some of the beta versions of stuff – the stable release at the time of writing doesn’t provide an interface for decrypting files. The only thing I’m unhappy about is that it automagically decrypts files for me, without prompting for the pass phrase. So while it’s good for protecting the file in the cloud, it’s not so great for protecting the local copy (yes, I know that there’s little protection if someone swipes the laptop).
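The shape of it, using a symmetric pass phrase so that the laptop doesn’t need my key pair in place first (the file name is illustrative):

```shell
# Encrypt with a pass phrase (gpg prompts for it); AES-256 throughout.
# Writes passwords.txt.gpg alongside the original.
gpg --symmetric --cipher-algo AES256 passwords.txt

# ...move passwords.txt.gpg via the cloud, then on the laptop:
gpg --output passwords.txt --decrypt passwords.txt.gpg
```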

Which leaves me with the old hack – create an encrypted DMG with the file(s) in it. It’s a pretty straight forward process:

  1. Run Disk Utility.
  2. Select “New Image” and specify one of the encryption options. Other than the size and name, the rest of the options can be left at their defaults.
  3. Copy the files into the new DMG.
  4. There is no step 4.

The only alarming gotcha is that it appears that you can decrypt the image without providing a credential, if you have allowed Disk Utility to store the pass phrase in your keychain. The trick is twofold – first, credentials are kept in a cache for a few minutes after use so that you usually don’t have to provide them in rapid succession. You can flush the cache by locking the keychain again. The second part is that by default the keychain remains unlocked after login. You can tweak these settings by going into the preferences for Keychain Access – I like to select “Show Status in Menu Bar”, and deselect “Keep login keychain unlocked”.
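The same image can be created from the command line, and the credential cache flushed, roughly like so (hdiutil and security are macOS tools; the size and names are illustrative):

```shell
# Create a 50 MB AES-256 encrypted image; hdiutil prompts for a pass phrase.
hdiutil create -size 50m -fs HFS+ -encryption AES-256 \
    -volname Secrets secrets.dmg

# Mount it (or just double-click the .dmg) and copy the files in:
hdiutil attach secrets.dmg

# Lock the keychain again to flush any cached credential:
security lock-keychain login.keychain
```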

All of which takes me off on a ramble from what I was thinking about. It seems to me like the battle to allow and encourage strong personal encryption and digital signing has been abandoned, and the focus has shifted purely to secure use of online services. There are a few personal file protection products on the market, of unknown and unverified strength, and a few more business focussed products. The intended widely available public key infrastructure for general public use never eventuated, subsumed instead by an industry focussed around providing certificates for Web sites and certificates for B2B secure communications.

Apple provides File Vault as a means to encrypt the entire disk, and there are similar products available for various versions of Windows, but the trouble remains that for encrypting a subset of files the software is either dodgy or highly technical. And don’t get me started on digital signatures on mail.