
Buggy MemoryMapping in the JDK

I just ran into a problem using memory-mapped byte buffers on Java 5. The basic use case is: read from a socket, write the data to a file, and then map the resulting file into memory to perform digest calculations and the like across the entire file. It works the first time, but any subsequent attempt to rewrite the file fails with the message “The requested operation cannot be performed on a file with a user-mapped section open.”
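
For context, the digest pass over the mapped buffer looks roughly like this, given the MappedByteBuffer bb created in the snippet further down (the algorithm here is purely illustrative):

  import java.security.MessageDigest;

  // MessageDigest can consume a ByteBuffer directly, reading from its
  // current position up to its limit, so the whole mapped file can be
  // digested in one call.
  MessageDigest md = MessageDigest.getInstance("SHA-1");
  md.update(bb);
  byte[] fileDigest = md.digest();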

After some searching, I found that this is a relatively common problem, and it is a facet of the way that memory mapping actually works on the underlying OS. It seems you can have speed or safety, but not both. The issue is that there is currently no way to implement a reliable, platform-independent unmap operation. An example of unmap() at the C level can be seen here.

In fairness to Sun, they can hardly be blamed for the existence of this problem – it seems to be genuinely intractable at present. The end of their evaluation note in the Bug Parade entry for this bug reads:

We at Sun have given this problem a lot of thought, both during the original
development of NIO and in the time since. We have yet to come up with a way to
implement an unmap() method that’s safe, efficient, and plausibly portable
across operating systems. We’ve explored several other alternatives aside from
the two described above, but all of them were even more problematic. We’d be
thrilled if someone could come up with a workable solution, so we’ll leave this
bug open in the hope that it will attract attention from someone more clever
than we are.
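
For what it’s worth, the workaround that tends to circulate is to reach into the JDK’s internal cleaner behind the mapped buffer via reflection and trigger it by hand. This is only a sketch – it depends on unsupported Sun-internal classes, is not portable, and the buffer must never be touched again after cleaning:

  import java.lang.reflect.Method;
  import java.nio.MappedByteBuffer;

  // Forcibly release a mapping by invoking the internal cleaner on the
  // buffer. Unsupported and unsafe: any later access to the buffer can
  // crash the VM.
  static void unmap(MappedByteBuffer bb) {
    try {
      Method cleanerMethod = bb.getClass().getMethod("cleaner");
      cleanerMethod.setAccessible(true);
      Object cleaner = cleanerMethod.invoke(bb);
      Method cleanMethod = cleaner.getClass().getMethod("clean");
      cleanMethod.setAccessible(true);
      cleanMethod.invoke(cleaner);
    } catch (Exception e) {
      // Nothing sensible to do here; fall back to waiting for the GC.
    }
  }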

Just for reference, the code that creates the buffer is shown below:


  FileChannel fc = new FileInputStream(requestedFile).getChannel();
  int sz = (int) fc.size();

  // Map the whole file read-only; the mapping holds the file's
  // user-mapped section open until the buffer is garbage-collected.
  MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, sz);

  byte[] buffer = new byte[sz];
  bb.get(buffer, 0, sz);
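
One way to sidestep the problem entirely, at the cost of the mapping’s speed, is to skip the map() call and read through the channel into an ordinary heap buffer instead; a sketch:

  import java.io.FileInputStream;
  import java.nio.ByteBuffer;
  import java.nio.channels.FileChannel;

  // Read the whole file without mapping it, so no user-mapped section is
  // left open and the file can be rewritten afterwards.
  FileChannel fc = new FileInputStream(requestedFile).getChannel();
  try {
    ByteBuffer buf = ByteBuffer.allocate((int) fc.size());
    while (buf.hasRemaining() && fc.read(buf) != -1) {
      // keep reading until the buffer is full or EOF is reached
    }
    byte[] buffer = buf.array();
    // ... digest calculations over buffer ...
  } finally {
    fc.close();
  }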

ClassCastExceptions and Hibernate mappings

I just had an issue where Hibernate was throwing ClassCastExceptions when trying to persist an entity to the database. I tracked the problem down to a specific property – a byte[]. I had just switched my entity mappings from an hbm.xml file to an annotations-based approach. The solution was fairly straightforward, as it turned out – just specify the mapping type explicitly as “binary”. It seems that Hibernate may be attempting to map it as a blob type, which doesn’t map transparently to primitive byte arrays (yet – I see there is a PrimitiveByteArrayBlobType in the Hibernate Annotations API). Meanwhile, I just declare the mapping like so:




  @Column
  @Type(type="binary")
  public byte[] getRawData() {
    return rawData;
  }
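
For completeness, here is roughly how that property sits in its entity; the class and field names below are invented for illustration:

  import javax.persistence.Column;
  import javax.persistence.Entity;
  import javax.persistence.Id;
  import org.hibernate.annotations.Type;

  // Illustrative entity: the explicit "binary" type keeps Hibernate from
  // falling back to a blob mapping for the byte[] property.
  @Entity
  public class RawDataHolder {

    private Long id;
    private byte[] rawData;

    @Id
    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    @Column
    @Type(type = "binary")
    public byte[] getRawData() { return rawData; }
    public void setRawData(byte[] rawData) { this.rawData = rawData; }
  }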


Eclipse 3.1 M6 and Generics

The Eclipse compiler seems to choke on the following code, apparently due to a generics-related issue. Consider the following:

private Set<ActionState> stateHistory = new TreeSet<ActionState>();

Now, I know a TreeSet stores its elements in natural ascending order, but consider the case where you might try to do this:

  @Transient
  public ActionState getCurrentState() {
    if (stateHistory == null || stateHistory.size() == 0) {
      return null;
    }
    return Collections.max(stateHistory);
  }

Eclipse chokes with the error:


Bound mismatch: The generic method max(Collection<? extends T>) of type Collections is not
applicable for the arguments (Collection<? extends ActionState>) since the type ActionState is not
a valid substitute for the bounded parameter <T extends Object & Comparable<? super T>>

This is confusing, as ActionState implements Comparable.
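
For reference, the sort of declaration in question looks roughly like this, assuming the generic Comparable<ActionState> form (the ordering field is invented for illustration):

  // Illustrative only: ActionState with a natural ordering, which is what
  // both the TreeSet and Collections.max rely on.
  public class ActionState implements Comparable<ActionState> {

    private final int sequence;

    public ActionState(int sequence) {
      this.sequence = sequence;
    }

    public int compareTo(ActionState other) {
      return this.sequence < other.sequence ? -1
           : this.sequence == other.sequence ? 0 : 1;
    }
  }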

If I change the offending line to read:

return ((TreeSet<ActionState>)stateHistory).last();

It works fine. It looks like it may be a bug in Eclipse’s compiler, as colleagues using IntelliJ IDEA have no such problems.
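
One possible variant of the same workaround, which avoids the cast altogether, is to declare the field as a SortedSet in the first place:

  import java.util.SortedSet;
  import java.util.TreeSet;
  import javax.persistence.Transient;

  private SortedSet<ActionState> stateHistory = new TreeSet<ActionState>();

  @Transient
  public ActionState getCurrentState() {
    if (stateHistory == null || stateHistory.isEmpty()) {
      return null;
    }
    // last() returns the highest element in the set's natural ordering,
    // which is exactly what Collections.max would have produced.
    return stateHistory.last();
  }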