Lillian ran into a few minor problems trying to build ‘raw’ OpenJDK6 (i.e. the tarball direct from Sun, without any of the numerous IcedTea patches, fixes and extensions) yesterday, so I decided to give it a go as well. Unfortunately, it seems it still isn’t possible to build it out of the box on a modern GNU/Linux system. In the end, I had to apply three patches (effectively two, as one was split in two by the HotSpot build changes in IcedTea). patches/hotspot/original/icedtea-gcc-4.3.patch and patches/icedtea-gcc-4.3.patch from IcedTea6 are needed to fix some issues when building with GCC 4.3, which is the default on most current distributions. In released versions, this is a single patch, but in current Mercurial and the upcoming 1.4 release, the HotSpot changes in all patches are split off to enable the version of HotSpot used by the build to be changed. patches/icedtea-no-bcopy.patch was needed to remove some local defines of BSD functions such as bcopy. There was some noise about taking this upstream on the mailing lists, but it doesn’t seem to have made it into a tarball yet.

With these patches, I could build as follows:

$ mkdir openjdk
$ tar xzf openjdk-6-src-b14-25_nov_2008.tar.gz -C openjdk
$ cd openjdk
$ patch -Np1 < $ICEDTEA_SOURCES/patches/icedtea-gcc-4.3.patch
$ patch -Np1 < $ICEDTEA_SOURCES/patches/hotspot/original/icedtea-gcc-4.3.patch
$ patch -Np1 < $ICEDTEA_SOURCES/patches/icedtea-no-bcopy.patch
$ cd control/build
$ unset JAVA_HOME
$ make LANG=C ALT_BOOTDIR=$CURRENT_ICEDTEA_INSTALL ALT_JDK_IMPORT_PATH=$CURRENT_ICEDTEA_INSTALL \
    IMPORT_BINARY_PLUGS=false ANT=/usr/bin/ant ANT_HOME=/usr/share/ant

...sometime later...

>>>Finished making images @ Fri Jan 30 10:59:43 GMT 2009 ...
make[1]: Leaving directory `/tmp/openjdk/jdk/make'
Control build finished: 09-01-30 10:59
$ ../build/linux-amd64/j2sdk-image/bin/java -version
openjdk version "1.6.0-internal"
OpenJDK Runtime Environment (build 1.6.0-internal-andrew_30_jan_2009_10_46-b00)
OpenJDK 64-Bit Server VM (build 11.0-b17, mixed mode)

You can build a bit quicker if you make use of parallelisation, which I neglected to do here; add

HOTSPOT_BUILD_JOBS=$PARALLEL_JOBS ALT_PARALLEL_COMPILE_JOBS=$PARALLEL_JOBS

to the make command line, where PARALLEL_JOBS is the number of processes you want to run simultaneously. The usual rule is the number of cores plus one. IcedTea of course supports this too; just add --with-parallel-jobs=$PARALLEL_JOBS when you run configure and it will add the necessary make wizardry for you.

For the configuration, CURRENT_ICEDTEA_INSTALL should point to your existing IcedTea install. Bootstrapping from an existing install like this is currently the only way to build OpenJDK6; I didn't try it this time round, but I know from experience that it still needs a fair few patches to remove Sunisms from the OpenJDK source code before a GNU Classpath JDK like GCJ can be used instead. This shouldn't be a problem though; you'll find IcedTea in Fedora (yum install java-1.6.0-openjdk), Gentoo (emerge -v icedtea6-bin), Ubuntu and Debian testing (aptitude install openjdk-6-jdk). The magic path is usually something like /usr/lib/jvm/blah, where blah is java-1.6.0-openjdk on Fedora and icedtea6 on Gentoo.

ICEDTEA_SOURCES points to a copy of the IcedTea tree. If you get this from a release tarball rather than hg, then you don't need to try finding the second gcc patch... you'd have a hard time doing so ;)

The OpenJDK build doesn't like environment variables such as LD_LIBRARY_PATH and JAVA_HOME being set. It also complains if you have any fancy modern locale set in LANG, so it's simplest just to run make with LANG=C. The JAVAC environment variable is a funny one; it allows the path to javac to be overridden for the build, but it doesn't just override the binary path as it should: it also drops all the necessary memory and classpath options passed to the compiler. Thus, I don't see how any build which sets JAVAC could work... Gentoo seems to like to set both JAVA_HOME and JAVAC for some reason, so make sure they are unset before you build.

Of course, with the resulting build, you'll be missing a lot of features that IcedTea provides...

  • A web plugin
  • Java web start
  • PulseAudio sound support
  • Support for architectures like PPC, ARM and MIPS via Zero or CACAO
  • A faster, more recent HotSpot if you use the current Mercurial tree or the upcoming 1.4 release
  • Lots of other lovely stuff provided by our patches, including a more up-to-date version of Gervill, the results of the XRender pipeline project and support for testing your build with JTReg.

You also avoid the pain of having to remember those crazy make variables; it's just ./configure; make on most modern GNU/Linux systems; file a bug if that's not the case.

I still feel that the IcedTea project does a very important job in taking the raw materials provided by Sun and turning them into something useful. That's why you'll find it's IcedTea being shipped with all those distros, and not 'straight' OpenJDK (which in most cases actually means the version that may-eventually-be-1.7). Thanks to all the great developers for their continued efforts, and to Red Hat for supporting this work.

The other thing about your own build is you can't of course actually call it 'Java'; one of the things you need to do before you can is run it through the Java Compatibility Kit. It has lots of tests your build has to pass. Unfortunately, Sun still keep this as a horrible proprietary blob of code which means you have to sign up to get a copy, be 'approved' and then work on your testing in a dark clammy room in secret. Fortunately, the kind people at Red Hat have already undertaken this momentous task for you, so you can just grab one of the OpenJDK builds in Fedora which has already passed. Thus, according to Sun's FAQ, the Fedora binaries 'are compatible with the Java(TM) SE 6 platform' :)

Let's hope things continue to improve in the direction of making more things in the Java world open and Free. In the Free Java room at this year's FOSDEM (happening just next weekend), you'll be able to find out all about what's coming next and meet some of the brave people forging the new frontiers...

Happy hacking! :)

I’ve been meaning to write this blog for a while, but other things have cropped up in the meantime. The topic is something that came up when trying to lower the number of warnings produced by a build of GNU Classpath.

While GNU Classpath has required a 1.5-capable compiler since 0.95, so that we could implement things like java.lang.Enum, the use of generics and friends has largely only been applied to the creation of new classes (like java.util.ServiceLoader) and the suppression of JAPI differences. The internal code, such as that in the gnu.* packages and the private variables within the java.* and javax.* classes, has remained in 1.4 form. As a result, the compiler generates a lot of ‘unchecked’ warnings, mainly due to the use of ‘raw’ collections (I’ll explain what these are shortly). ecj is still our preferred compiler (being the most tested and the one we had available as Free Software first) and it generates far more warnings by default than Sun’s javac. At the present count, Classpath CVS generates just over 10,000 warnings with ecj 3.3. We clearly need to cut this down so we can spot real problems. In the past, we’ve simply turned them off, but it’s better for the codebase in general if they are properly cleaned up, and in some cases doing so causes bugs to be discovered.

So what’s the problem? Well, mainly it’s a case of most of the GNU Classpath code still looking like:

List list = new ArrayList();


Map map = new HashMap();
map.put("key", new Value());

Map, List, ArrayList and HashMap are now referred to as raw types because the versions in Java 1.5 and above have one or more type parameters. These type parameters can be used to tell us what is stored inside the collection. We should be using Map<K,V>, List<T>, ArrayList<T> and HashMap<K,V>, the parameterized types. The type parameters (K, V and T in this case) can be used in methods to specify that the type of an argument or return value depends on the type given when the collection is created. Thus, Map<K,V> has:

V put(K key, V value)

as opposed to:

Object put(Object key, Object value)

K is the type of key used for the map, and V is the value. In our original example, our Map should be replaced with Map<String,Value> because it maps keys of type String to values of type Value. The main advantage of using these is you no longer need to cast elements when they are returned from the collection. So a get call on a List<T> or Map<K,V> returns a value of type T or V respectively, not simply an Object which has to be cast manually by the user. In reality, backwards compatibility means these casts are still being inserted by the compiler, but for it to do this it must have been determined to be safe to do so.
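
To make that concrete, here is a minimal sketch (the class name and values are mine, not from Classpath) contrasting a raw Map with a parameterized one:

```java
import java.util.HashMap;
import java.util.Map;

public class NoCastDemo {
  public static void main(String[] args) {
    // The raw 1.4 style: everything comes back as Object and must be
    // cast by hand. This half generates the unchecked warnings the
    // article is about.
    Map raw = new HashMap();
    raw.put("one", Integer.valueOf(1));
    Integer i = (Integer) raw.get("one");    // manual cast required

    // The parameterized 1.5 style: the compiler knows V is Integer.
    Map<String, Integer> typed = new HashMap<String, Integer>();
    typed.put("one", Integer.valueOf(1));
    Integer j = typed.get("one");            // no cast needed
    System.out.println(i.equals(j));
  }
}
```

The bytecode for both halves is essentially the same; the checked cast is still there in the parameterized case, but it is inserted by the compiler only after it has proved the cast safe.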

So how do we clean up these warnings? It means going through the code and adding appropriate type parameters to our collections. This isn’t always as easy as it sounds. In some cases, the collection will only be used with one type so it’s simply a matter of determining what that is (usually by looking for the casts when objects are retrieved from the collection — these casts can soon be removed). However, because raw collections take Objects as input, there can be a mix of types so a common supertype has to be found.

In some cases, this has to be Object. So, you may think, what’s the point of turning Map into Map<Object,Object>? Surely they are the same thing. No they’re not, and this is one of the more interesting aspects of collections and one you’ll especially come across when you have to deal with using a raw collection coming from legacy code without generating unchecked warnings. Map the raw type is actually equivalent to Map<?,?>, where ? represents a wildcard. Wildcards allow the use of existential types; instead of having a strict instantiation of a type parameter, we can refer to any type within certain bounds. By default, a wildcard has an upper bound of Object and a lower bound of null i.e. Map<?,?> is the same as Map<? extends Object super null, ? extends Object super null>. What does this say in English? It says that the keys and values of the map are of some type that extends Object but we don’t know exactly what. In contrast, Map<Object,Object> says that the keys and values must be Objects.
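
A short sketch of that difference (again, the names here are mine for illustration): any Map, whatever its parameters, can be treated as a Map<?,?> and read from, but only a Map<Object,Object> promises to accept arbitrary keys and values.

```java
import java.util.HashMap;
import java.util.Map;

public class WildcardDemo {
  // A Map<?,?> can be read from safely, but nothing except null can be
  // put into it: the compiler has no idea what the real K and V are.
  static Object lookup(Map<?, ?> m, Object key) {
    // m.put(key, "value");   // would not compile
    return m.get(key);
  }

  public static void main(String[] args) {
    // A Map<Object,Object> accepts any key and any value...
    Map<Object, Object> objMap = new HashMap<Object, Object>();
    objMap.put("answer", Integer.valueOf(42));

    // ...and, like every parameterized Map, it is also a Map<?,?>.
    // A Map<String,Integer> could be passed to lookup too, but it could
    // NOT be assigned to a Map<Object,Object> variable.
    System.out.println(lookup(objMap, "answer"));
  }
}
```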

In practice, we don’t want to work with wildcard types directly, because we can’t declare variables of type ?. Instead, we use them when importing from and exporting to parameterized types. For example, to import objects from another collection, addAll is not defined as:

void addAll(Collection<T> t)

because doing so would mean that we can only take objects from collections of exactly the same type. Instead, we want to also allow collections of some subtype. For example, a collection of Objects should be allowed to be filled from a Collection of Strings. The above signature doesn’t allow this, but:

void addAll(Collection<? extends T> t)

does. This says that the collection from which the elements are taken must contain objects of some type which is a subtype of T. A similar example for super is the use of Comparator<? super T>. When we want something that can compare two objects of type T, we can use a comparator for exactly that type, but we can also use a more general one that compares some supertype. For example, a Comparator<Object> can be used to compare Strings. We can’t go the other way; it would be inappropriate to use a Comparator for Integers to compare general Number instances.
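
Both bounds can be seen in the standard library; this sketch (class and field names are mine) exercises addAll's `? extends` bound and Collections.sort's `? super` bound:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class PecsDemo {
  // A comparator of the supertype Object, usable wherever a
  // Comparator<? super String> (or <? super AnythingElse>) is wanted.
  static final Comparator<Object> BY_STRING = new Comparator<Object>() {
    public int compare(Object a, Object b) {
      return a.toString().compareTo(b.toString());
    }
  };

  public static void main(String[] args) {
    List<String> strings = new ArrayList<String>();
    strings.add("pear");
    strings.add("apple");

    // '? extends': addAll(Collection<? extends E>) lets a collection of
    // Objects be filled from a collection of the subtype String.
    Collection<Object> objects = new ArrayList<Object>();
    objects.addAll(strings);

    // '? super': Collections.sort(List<T>, Comparator<? super T>) lets
    // our Comparator<Object> order a List<String>.
    Collections.sort(strings, BY_STRING);
    System.out.println(strings);
  }
}
```

Swapping the bounds breaks both calls: a Collection<String> could not addAll from a Collection<Object>, and a Comparator<Integer> could not sort a List<Number>.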

The most confusing aspect of dealing with generics is how to handle legacy code where you can’t make the incoming type a parameterized type. Take the following legacy method:

  public List createFruits()
  {
    List x = new ArrayList();
    x.add("Apple");
    return x;
  }

This returns a raw type, List. Now imagine we can’t see the body of the method. All we know is that the method returns a List; we don’t know what is in that list. This is how the compiler sees the method.

We of course know that it contains String objects. So we try and pass it to a method that takes a List<String>:

  public void printList(List<String> l)
  {
    for (String s : l)
      System.out.println(s);
  }

The obvious way to do this is:

printList(createFruits());
and this will compile, but it produces an unchecked warning:

warning: [unchecked] unchecked conversion
found   : java.util.List
required: java.util.List<java.lang.String>

So we take the obvious solution to this from the old 1.4 days and cast it:

printList((List<String>) createFruits());

Again, this compiles but we get a different unchecked warning:

warning: [unchecked] unchecked cast
found   : java.util.List
required: java.util.List<java.lang.String>
    printList((List<String>) createFruits());

So we get a warning with the cast, and one without. What do we do?

The answer is to take a step back and think about what this incoming List really is. As we noted before, the equivalent of List in the new 1.5 world is List<?> so:

List<?> l = (List<?>) createFruits();

No warnings, so far so good. This is deemed a safe cast because we are merely telling the compiler to move from the 1.4 to 1.5 version of the same thing. But how do we change this into a List<String>?

Remember that the ? wildcard without explicit bounds is telling us that the most we know about the contents of the list is that they are some subtype of Object. As such, the only thing we can safely retrieve them as is Objects:

    List<Object> newList = new ArrayList<Object>();
    for (Object o : l)
      newList.add(o);

This takes each Object from the list and puts it in a new list which holds Objects. By doing so, we have removed the doubt about what is in the collection and told the compiler to simply treat them all as Objects. What we have now is equivalent to what we thought we had to start with. However, this still won’t work with our printList method:

printList(java.util.List<java.lang.String>) in Test cannot be applied to (java.util.List<java.lang.Object>)

This is an error, so the code will now not even compile. This is good; we don’t want the compiler to allow us to pass collections of mere Objects to methods requiring collections of Strings. That would take us straight back to the Java 1.4 days. The solution is to create a List<String> instead of a List<Object> and check that each object is a String in the body of the loop which adds them to the collection.

    List<String> newList = new ArrayList<String>();
    for (Object o : l)
      if (o instanceof String)
        newList.add((String) o);

Finally, we have working code which has no warnings, while still using the legacy code. However, this can be quite inefficient; in some cases, we want to avoid iterating over the entire collection when we know the check is going to happen anyway. A common case is the addAll method of collections: addAll will cast each object as it adds it to the list anyway (the retrieval from the producer list generates such a cast). In these cases, we can use the annotation @SuppressWarnings("unchecked") to turn off the warning we know is superfluous.
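
As a sketch of that addAll shortcut (the class and method names here are hypothetical, not Classpath code), assuming a legacy method we trust to return only Strings:

```java
import java.util.ArrayList;
import java.util.List;

public class LegacyBridge {
  // Legacy 1.4-style code: returns a raw List that we happen to know
  // only ever contains Strings.
  static List createFruits() {
    List x = new ArrayList();
    x.add("Apple");
    x.add("Pear");
    return x;
  }

  // Instead of an instanceof check per element, cast once and let
  // addAll do the work, suppressing the single warning we have
  // convinced ourselves is safe.
  static List<String> asStringList(List<?> raw) {
    @SuppressWarnings("unchecked")
    List<String> source = (List<String>) raw;  // unchecked, but known safe
    List<String> strings = new ArrayList<String>();
    strings.addAll(source);
    return strings;
  }

  public static void main(String[] args) {
    System.out.println(asStringList(createFruits()));
  }
}
```

If the trust in the legacy method is misplaced, the failure surfaces later as a ClassCastException at the point of use; that deferred failure is exactly the price of the suppression.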

This should be used with care. It should also cover the minimum area possible to avoid suppressing other warnings. Annotations can go on individual assignments so there is no need to suppress warnings for the entire method. For example, here is Classpath’s getAnnotation method:

  public <T extends Annotation> T getAnnotation(Class<T> annotationClass)
  {
    // Inescapable as the VM layer is 1.4 based.
    @SuppressWarnings("unchecked")
      T ann = (T) cons.getAnnotation(annotationClass);
    return ann;
  }

cons.getAnnotation will return something of type Annotation as our VM layer is strictly 1.4 only. As we know from the input class that the annotation will be of type T, we can forcibly apply a cast and disable the warning. Note that the suppression applies only to the one line, and the explicit assignment of ann is used to allow this (an annotation can’t be attached to a return statement). We also add a comment to explain the reasoning behind adding this annotation. This should be used sparingly and where possible generics should be used properly. In some cases, we don’t even need to convert the collection; retrieving the size can be achieved simply from a List<?> or similar.

My thanks to Joshua Bloch and his ‘Effective Java’ book for finally explaining some of the solutions documented here. I didn’t realise until reading this that annotations could be applied to such a narrow scope or that it was safe to cast to a wildcard type from a raw type. This has enabled me to clean up a lot of the Classpath code.

Congratulations also to the IcedTea team, especially the OpenJDK Debian Team, for getting openjdk-6 into sid!

And finally, congratulations to Mark and Petri on the birth of their son, Jonas :)

It seems a lot of projects and distributions are seeing new releases either now or in the very near future. This week, we had a very quiet minor release of GJDoc, the GNU Classpath equivalent to javadoc. 0.7.9 includes a few changes that were previously only available in CVS, but the main one is a small fix that allows Classpath 0.97.1 documentation to be built. Our minor .1 release for 0.97 fixed a bug where the JSR166 code was not being included in the documentation build. With this fixed, it turns out gjdoc would no longer build the documentation as java.util.concurrent.TimeUnit is a rather complicated enumeration that our hacks can’t bypass. Michael Koch, in packaging GJDoc for Debian, was kind enough to point out that having the current release of GJDoc not being able to build documentation for the current release of Classpath was a bad thing. A quick release fixed this by pushing out the fix I made for this issue back in March. Of course, you can now use javadoc for IcedTea/OpenJDK to build the documentation instead; with another Free JDK about, there’s no need to just rely on GJDoc.

I do wonder what the long-term future for GJDoc should be. It only works with GNU Classpath at present through a nasty bunch of hacks which cause the parser to skip chunks of the input. It really needs a major cleanup and to be made to work properly with 1.5 code. Thomas Fitzsimmons suggested we should merge it into the GNU Classpath codebase, which seems a good idea, as it means we don’t run into this same revision hole we just did. However, is it worth maintaining GJDoc at all? For me, the main features it has over the OpenJDK javadoc are speed and the look of the output. A key feature is also that it plays nicer with Free Software, i.e. it includes an option to include the source code with syntax highlighting. You can see the output for Classpath 0.97 online.

JikesRVM is also stepping up for a new release, 2.9.3, and this will be the first to showcase the new Classpath support for a non-copying, unsynchronised StringBuilder. This is designed for local method usage, where the builder is converted to an immutable String object rather than ever leaving the method. As a result, I’ve been rushing to get it in a releasable state, as I know there’s a nasty bug lurking in the older patches JikesRVM has been using recently. I managed to do this today after we fixed a build issue. It seems the javah in OpenJDK6 outputs differently named header files from those JikesRVM implicitly depended on. We fixed this by making the dependency explicit, as it should be, but perhaps this also uncovered an OpenJDK6 bug. I’m not sure where we should be filing these yet, so I just posted to jdk6-dev.

It’s also nice to hear that Ubuntu has just shipped with IcedTea6 included. Fedora 9 will also ship early next month (May 13th) with similar support and an OpenSUSE build is in the works. It’s nice to see Java support making it into the mainstream, thanks to Sun’s recent moves to make their JDK Free Software. On the less positive side, it seems that Gentoo won’t see support for IcedTea6 anytime soon. The Java Gentoo developers seem to be on a strange mission to support only the proprietary Java solutions (pretty much an inverse of what Fedora, Ubuntu and Debian do). In porting my IcedTea6 ebuild from the Libre Java overlay to their own overlays, they seem to have decided to drop support for GCJ… I’m not even going to go into how dumb this action is, as I could be here a while. Suffice to say, I don’t see how IcedTea6 can be bootstrapped without GCJ, let alone how they expect to then build it on architectures like PPC, PPC64 and ARM, as we’ve seen happen on the OpenJDK distro mailing list. It seems a very odd move for a distribution supposedly built on compiling things from source…

As those who’ve had the unfortunate luck of being in our IRC channel over the past few days will know, I’ve been trying to build IcedTea6 on our Debian etch server again. The good news is it looks like I may have succeeded this time, with some help from Christian Thalinger (twisti) and Gary Benson (gbenson).

Basically, most of the tools that come with etch will be useless for this task. The versions of Kaffe, GCJ and GNU Classpath+JamVM that come with it don’t support the 1.5 language extensions. To add insult to injury, because the gcj backport used by Fedora is ’4.1′, IcedTea won’t scream at you for a default ./configure; make. Instead, the OpenJDK sanity check will tell you that you don’t have a 1.6 VM to bootstrap with (although you only actually need a 1.5 one…). This is because IcedTea will symlink bootstrap/jdk1.6.0/bin/java to the gij that is part of etch, which reports itself as 1.4. Similarly, the jar file it picks up is a pre-generics version of Classpath that can’t be used even with a VM that claims to be 1.5.

So you need to build your own Classpath-based Java stack before trying out IcedTea6. This machine is used for my Classpath regression testing, so I already had an install of 0.97.1, but it’s simple enough to get a copy of GNU Classpath and a compatible VM if you don’t. You CAN use what is installed with etch to build this; the simplest way is to install the java-gcj-compat-dev package, which should give you a native version of ecj.

Using this, you can build Classpath 0.97 or any version back to 0.95 (the first to include the code merged from the generics branch). ./configure; make; make install should work, but you probably want to use --prefix to install it in a local directory rather than /usr/local/classpath. --disable-plugin is something I also usually add, to avoid needing the Mozilla headers.

Once installed, you need to add a 1.5 VM that works with this. CACAO (from the Mercurial repository), JamVM or the new Kaffe release should work. I used CACAO. You can then configure IcedTea6 by pointing it at the new install. Assuming $CLASSPATH is where you installed everything:

./configure --with-java=$CLASSPATH/bin/java --with-libgcj-jar=$CLASSPATH/share/classpath/ --with-jar=$CLASSPATH/bin/gjar --with-rmic=$CLASSPATH/bin/grmic --with-javah=$CLASSPATH/bin/gjavah

Before building, you also need to patch glibc (thanks to twisti for this tip). Don’t worry! It’s not too scary, just a simple change to a linker script, to make it the same as in more recent versions. On x86, the change is in /usr/lib/ The last line should read:

GROUP ( /lib/ /usr/lib/libc_nonshared.a AS_NEEDED (/lib/ ) )

If you’re on x86_64 or ppc64 (or any 64-bit platform, I guess), you need to patch /usr/lib64/

GROUP ( /lib/ /usr/lib/libc_nonshared.a AS_NEEDED ( /lib/ )

That done, you should be able to run make and hopefully reach the message that IcedTea is served. YMMV of course, so let me know how you get on. This was quite a bit easier than when I last remember trying, so hats off to the IcedTea folks for all their hard work. Next stop, Solaris… ;)