Sunday, December 30, 2012

Lambdas: Coming to JDK8 Soon!

It's exciting to see lambdas finally making it into Java after so long! I was eager to dig in, so I whipped up some sample code and put together a presentation on JDK8 lambdas for some other developers.

One caveat: As I write this, the lambda API is still under development, and it's tricky to learn an API and write samples that work when the API changes from day to day. When I migrated from JDK8 dev build 64 to build 69, classes had been renamed and methods had moved or disappeared. The presentation here uses lambda build 69, and the code samples in the presentation are fully functional but will undoubtedly be out of date when the lambda API is finalized.

Be that as it may, it's exciting to learn a completely new way of doing things, especially in Java. You can check out my presentation for all the details, but the critical resources might be nice to have available here:
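To give a quick taste of the core feature (which survived from these early builds into the final release), a lambda can stand in for any single-method interface. This is a minimal sketch, not code from the presentation; `Greeter` is a made-up interface for illustration:

```java
import java.util.Arrays;
import java.util.List;

public class LambdaDemo {

    // any interface with a single abstract method
    // can be implemented with a lambda
    interface Greeter {
        String greet(String name);
    }

    public static void main(String[] args) {
        // a lambda in place of an anonymous inner class
        Greeter greeter = name -> "Hello, " + name + "!";
        System.out.println(greeter.greet("JDK8")); // Hello, JDK8!

        // a lambda as a comparator
        List<String> words = Arrays.asList("banana", "Apple", "cherry");
        words.sort((a, b) -> a.compareToIgnoreCase(b));
        System.out.println(words); // [Apple, banana, cherry]
    }
}
```

Compare that one-liner to the five or so lines the anonymous-inner-class version would take, and it's easy to see why this feature generated so much anticipation.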

Sunday, December 23, 2012

Uploading Binary Files the Fun Way

Here’s the problem: suppose you have a domain object (say, an item) which can be created RESTfully by POSTing JSON in a request body. Spring can automagically convert JSON request bodies to Java objects on the server, which is incredibly convenient. It’s also an incredible pain in the butt if you have a new requirement to upload the contents of a binary file as part of that JSON object (say, an item image to be stored on the server and displayed later).

Fortunately, there’s a fun way to include binary file content in your JSON object!

Here are the Java objects we’ll be dealing with. Note that the Image class has a byte[]: the bytes will come from an image file in the browser, be stored in the DB, and be streamed back upon request.

@Entity
public class Item {

  @OneToOne
  private Image image = null;

  ...
}

@Entity
public class Image {

  @Lob
  private byte[] bytes = null;

  @Basic
  private String contentType = null;

  ...
}


Next, on the browser we can use the HTML5 File API to load the file, encode its contents as Base64, and assign the encoded string to our JavaScript object before POSTing it.

function put() {

  // get a file object, we only need one
  var file = $('input[type="file"]').get(0).files[0];

  // obtain the object you want to post as a json object
  // in the body of a POST
  var item = ...

  if (file) {

    var reader = new FileReader();

    // after the file has loaded on the client,
    // convert to base64 and assign to the item before POSTing
    reader.onload = function loaded(evt) {
      item.image = {};
      item.image.bytes = arrayBufferToBase64(evt.target.result);
      item.image.contentType = file.type;

      // post json object as the request body
      // with the technique of your choice
      // I like using $.ajax(...)
      create(item);
    };

    reader.readAsArrayBuffer(file);
  }
  else {
    create(item);
  }
}

// pass in an HTML5 ArrayBuffer, returns a base64 encoded string
function arrayBufferToBase64(arrayBuffer) {
  var bytes = new Uint8Array(arrayBuffer);
  var len = bytes.byteLength;
  var binary = '';
  for (var i = 0; i < len; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return window.btoa(binary);
}


Now on the server, we have a Spring controller which will handle the request. As part of the JSON unmarshalling that converts the request body to an Item, the base64-encoded value in item.image.bytes is converted to a standard Java byte[] and stored as a LOB in the database.

@RequestMapping(value = BASE_URL + "/item", method = RequestMethod.POST)
public @ResponseBody long createItem(@RequestBody Item incomingItem)
{
  // authorization and input scrubbing removed for brevity

  service.createItem(incomingItem); // sets the itemId
  return incomingItem.getId();
}
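Behind the scenes, the JSON unmarshaller is doing the equivalent of a plain base64 decode on the bytes field. Here's that step in isolation, sketched with java.util.Base64 (which arrived in Java 8; your JSON library uses its own decoder, but the transformation is the same):

```java
import java.util.Base64;

public class Base64DecodeDemo {
    public static void main(String[] args) {
        // the browser's window.btoa produced base64 text;
        // the server turns it back into raw bytes
        String encoded = "aGVsbG8=";  // base64 for "hello"
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(new String(decoded)); // hello
    }
}
```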


To retrieve the image, another Spring Controller method can provide the image by streaming the bytes directly.

@RequestMapping(value=BASE_URL + "/item/{id}/image", method=RequestMethod.GET)
public void getItemImage(@PathVariable Long id, HttpServletResponse response) throws IOException
{
  Item item = service.loadItem(id);
  response.setContentType(item.getImage().getContentType());
  response.getOutputStream().write(item.getImage().getBytes());

  // don't close the output stream from the response
}


Now if you need to retrieve the image, you can simply reference the item’s id in the appropriate URL in an image tag: <img src=".../item/1/image" />

Beautiful!

Sunday, December 16, 2012

Unit Tests and Transactions

A question came up on StackOverflow recently that got me to thinking about how to unit test transaction-dependent behavior.

Generally we start with data access tests that are transactional so that each test can roll back when it's complete. This way each test can modify an in-memory database without affecting subsequent tests. 

Let's say we are tasked with writing code for a common workflow in our application that loads, updates, and saves an entity. We might write a test like this that hits a live in-memory database:

// everything is in one transaction
@Transactional
public void test() {
  String newValue = "somethingnew";
  Entity t1 = dao.getById(id);
  t1.setProperty(newValue);
  dao.update(t1);
  Entity t2 = dao.getById(id);
  assertEquals(newValue, t2.getProperty());
}

We then write code to make the test pass, and move on believing that All Is Good. But all is Not Quite Good.

Imagine that something is wrong with our dao.update() method and it actually does not update the database.
  1. In that case, the object t1 is stored in the current database session
  2. Retrieving t2 by id actually returns t1 by virtue of being called in the same transaction
  3. The test compares t1 with its own property, instead of comparing the new value with the persisted value
  4. The test passes but should have failed.
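The steps above can be illustrated with a toy identity cache. This is not a real ORM, and the class names are made up for the sketch; it just models the fact that repeated lookups within one session return the same in-memory instance:

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of a session-level identity cache -- not a real ORM.
public class SessionCacheDemo {

    static class Entity {
        String property = "original";
    }

    static class Dao {
        // within one "transaction", getById always
        // returns the same cached instance
        private final Map<Long, Entity> sessionCache = new HashMap<>();

        Entity getById(long id) {
            return sessionCache.computeIfAbsent(id, k -> new Entity());
        }

        void update(Entity e) {
            // BUG: the database write is silently broken
        }
    }

    public static void main(String[] args) {
        Dao dao = new Dao();
        Entity t1 = dao.getById(1L);
        t1.property = "somethingnew";
        dao.update(t1);              // does nothing
        Entity t2 = dao.getById(1L); // same instance as t1

        System.out.println(t2 == t1);                           // true
        System.out.println("somethingnew".equals(t2.property)); // true -- the assert would pass
    }
}
```

Even though update() never persists anything, the assertion against t2 succeeds because t2 *is* t1.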
One way around this is to remove the transactional boundaries from the test method and allow the methods we're testing to begin/commit their own transactions (which they will normally be doing anyway). This way if the update method does not update the entity, the bug will quickly be exposed and All Is Good again.

// test method is not @Transactional
// DAO methods are themselves @Transactional
public void test() {
  String newValue = "somethingnew";
  Entity t1 = dao.getById(id);    // transaction 1
  t1.setProperty(newValue);
  dao.update(t1);                 // transaction 2
  Entity t2 = dao.getById(id);    // transaction 3
  assertEquals(newValue, t2.getProperty());
}

One thing to be aware of with the second approach: if we are committing against an in-memory database like HSQLDB, changes made in one test will not be rolled back automatically and will be visible to subsequent tests. We either have to manually undo the changes at the end of each test, or ensure that the test data does not interfere with other tests, which is not so easy.

In summary: when would we use one kind of test versus the other? In general I think the benefits of rolling back after each test outweigh the risks, so that is usually my starting point. Testing a workflow that crosses multiple transactions is really a job for an integration test, and can easily be done with Selenium or SoapUI. So I favor a full integration test for that kind of behavior, and use a multi-transaction unit test to experiment with a new sequence of calls or to debug a specific scenario quickly, without the overhead of a full integration test.

What do you think? Do you favor multiple or single transaction tests? Have you seen the transaction test bug described here?