Thursday, October 27, 2011

Replacing javac with eclipse compiler in Maven

I was working on a Java project in eclipse where I used cyclic dependencies. Specifically, I implemented the Reverse MVP pattern with GWT Platform. Everything went well as long as I was using eclipse to compile the project, but once I tried to build it with Maven, I got compilation errors for every cyclic dependency. I figured that if the eclipse compiler is good enough to compile the sources at development time, it might as well be used at build time instead of the JDK's javac. Here is the maven-compiler-plugin configuration I initially had in the project POM:

<plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-compiler-plugin</artifactId>
   <version>2.3.2</version>
   <configuration>
      <source>1.6</source>
      <target>1.6</target>
   </configuration>
</plugin>

In order to replace javac with the eclipse compiler we need to do two things: add a dependency on plexus-compiler-eclipse, and tell the maven-compiler-plugin to use the eclipse compiler, as described here. Here is the updated configuration:

<plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-compiler-plugin</artifactId>
   <version>2.3.2</version>
   <configuration>
      <compilerId>eclipse</compilerId>
      <source>1.6</source>
      <target>1.6</target>
   </configuration>
   <dependencies>
      <dependency>
         <groupId>org.codehaus.plexus</groupId>
         <artifactId>plexus-compiler-eclipse</artifactId>
         <version>1.8.2</version>
      </dependency>
   </dependencies>
</plugin>
After that it was possible to build the project with Maven.

Thursday, September 22, 2011

Spring Social Meets Google APIs

For the past several weeks I have been working on the Spring Social Google project. Spring Social acts as an abstraction layer between your application code and various social APIs, removing the need for you to deal with authentication and HTTP-to-Java mapping. Now you can use it with a growing number of Google APIs, starting with the Contacts API and the Google+ API.

Why do I need this? Google already provides Java libraries for its APIs.

Indeed you can use the Java libraries from Google, but there are cases where you may benefit from using Spring Social instead. For one, Spring Social comes with an authentication mechanism that removes the need for you to write Servlet code for the stages of the OAuth2 process. You can read about it here. Another goodie Spring Social provides is connection management and persistence, so you don't have to deal with session management or with writing your own database schema to store the users' access tokens. You can read about it here. You can see how to set up both mechanisms in the example application, specifically in SocialConfig.java, WebMvcConfig.java and spring-config.xml.
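
To give a rough flavor of what such a configuration involves, here is a minimal sketch of the connection-factory registration part. The class and constructor names below follow the usual Spring Social conventions and are my assumptions, not code copied from the example application:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.social.connect.ConnectionFactoryLocator;
import org.springframework.social.connect.support.ConnectionFactoryRegistry;
import org.springframework.social.google.connect.GoogleConnectionFactory;

@Configuration
public class SocialConfig {

    // Register the Google connection factory with your OAuth2 client credentials;
    // GoogleConnectionFactory is assumed to follow the pattern of the other
    // Spring Social connection factories (e.g. FacebookConnectionFactory)
    @Bean
    public ConnectionFactoryLocator connectionFactoryLocator() {
        ConnectionFactoryRegistry registry = new ConnectionFactoryRegistry();
        registry.addConnectionFactory(
                new GoogleConnectionFactory("your-client-id", "your-client-secret"));
        return registry;
    }
}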

Spring Social also takes a different approach than Google's Java libraries: Google's libraries provide a Java API that mirrors the underlying REST API (be it Atom, Portable Contacts or other), while Spring Social aims to provide you with the data structures and operations you are likely to use.
Let's use both tools to write a program that fetches the user's contacts and prints their names and primary e-mail addresses.

Here is how it's done with the Google Contacts API Java library:

ContactsService service = getContactsService();
URL feedUrl = new URL("https://www.google.com/m8/feeds/contacts/default/full");
ContactFeed resultFeed = service.getFeed(feedUrl, ContactFeed.class);
for (ContactEntry entry : resultFeed.getEntries()) {
   String fullName = entry.getName().getFullName().getValue();
   String email = null;
   for(Email e : entry.getEmailAddresses()) {
      if(e.getPrimary()) {
         email = e.getAddress();
         break;
      }
   }
   System.out.println(fullName + " " + email);
}

And here is the equivalent Spring Social Google code:

GoogleOperations google = getGoogleOperations();
List<Contact> contacts = google.contactOperations().getContactList();
for(Contact contact : contacts) {
   String fullName = contact.getFullName();
   String email = contact.getPrimaryEmail();
   System.out.println(fullName + " " + email);
}

It's up to you to decide if you prefer the simplicity of Spring Social or the flexibility of the Google API Java libraries.
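
As an aside, the getGoogleOperations() helper in the snippet above is not part of the API. One plausible implementation, assuming the project follows the usual Spring Social pattern of a template class (here GoogleTemplate) constructed from an OAuth2 access token, might look like this:

// Hypothetical helper: in a real application the access token would come from
// Spring Social's connection repository rather than being handled directly
private GoogleOperations getGoogleOperations() {
    String accessToken = "<access token obtained through the OAuth2 flow>";
    return new GoogleTemplate(accessToken);
}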

Is the project stable? What is its roadmap?

Spring Social Google is still at an early stage, so there are no released builds yet and the API may change. Aside from the currently implemented Contacts and Google+ APIs, the project is expected to cover more Google APIs over time. You are welcome to download the sources and play with the example application, which is also deployed on Cloud Foundry.

Enjoy and send feedback!

Saturday, July 30, 2011

Dynamically Adding Styles to HTML Pages

I ran into a situation where I needed to apply styles to the entire page after the page had loaded. Targeting elements by their CSS selectors and applying styles to them with jQuery or similar tools would not have been sufficient, because I would have had to repeat it every time elements were added to the DOM, and it would have required adding a class (or some other attribute) to each element I wanted to style.

The solution I ended up implementing was to add styles to the page dynamically using JavaScript. Normal browsers let you manipulate the Head element like any other element, so all I needed to do was append a Style element as a child of the Head element and set its inner HTML to the CSS content:

function addCustomStyle() {
   var style = document.createElement("style");
   style.innerHTML = ".someclass {color: green;}";
   var head = document.getElementsByTagName("head")[0];
   head.appendChild(style);
}

Unsurprisingly, as in so many other cases, Internet Explorer has its own way of doing things. It doesn't allow manipulating the Head element like other elements, but it does provide an API for adding CSS rules to the document. Here is how it's done:

function addCustomStyle() {
   var style = document.createStyleSheet();
   style.addRule(".someclass", "color: green;");
}

You could wrap the two methods with IE conditional comments ([if !IE] and [if IE]) so only one of them is actually used depending on the browser, or you can come up with your own mechanism. If you are using GWT, deferred binding is perfect for this.

Thursday, January 20, 2011

Better Enum Mapping with Hibernate

JPA and Hibernate provide two ways to map enum fields to database columns: either map the enum value's ordinal with @Enumerated(EnumType.ORDINAL) or its name with @Enumerated(EnumType.STRING). Neither is ideal, because it's very easy to change the enum, such as reordering or renaming its values, and forget to migrate the existing values in the database accordingly. In fact, having to migrate existing data just because you changed Java code is hard to justify in the first place.
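
For reference, here is what the standard mapping looks like with stock JPA annotations - nothing project-specific here, just the fragile default we are about to replace:

@Entity
public class Person {

    // EnumType.STRING stores the constant's name ("MALE"/"FEMALE") in the column,
    // so renaming a constant breaks existing rows; with EnumType.ORDINAL the
    // constant's position is stored instead, so reordering the constants corrupts the data
    @Enumerated(EnumType.STRING)
    private Gender gender;

    // other fields and methods...
}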

Here is a mechanism that lets you decouple the enum value names and ordinals from the data, while still making it easy to map enum fields to database columns.

The solution can be specific to an enum type, but it would be better to have all your persistent enum types implement the same interface to reuse the mechanism. Let's define the PersistentEnum interface:

public interface PersistentEnum {
    int getId();
}

and it will be implemented by enum types, for example:

public enum Gender implements PersistentEnum {

    MALE(0),
    FEMALE(1);

    private final int id;

    Gender(int id) {
        this.id = id;
    }

    @Override
    public int getId() {
        return id;
    }

}

Now we need to define the Hibernate User Type that will do the conversion both ways. Each enum requires its own user type, so first we will code an abstract superclass that works with the PersistentEnum interface and then subclass it for each enum:

public abstract class PersistentEnumUserType<T extends PersistentEnum> implements UserType {

    @Override
    public Object assemble(Serializable cached, Object owner)
            throws HibernateException {
        return cached;
    }

    @Override
    public Object deepCopy(Object value) throws HibernateException {
        return value;
    }

    @Override
    public Serializable disassemble(Object value) throws HibernateException {
        return (Serializable)value;
    }

    @Override
    public boolean equals(Object x, Object y) throws HibernateException {
        return x == y;
    }

    @Override
    public int hashCode(Object x) throws HibernateException {
        return x == null ? 0 : x.hashCode();
    }

    @Override
    public boolean isMutable() {
        return false;
    }

    @Override
    public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
            throws HibernateException, SQLException {
        int id = rs.getInt(names[0]);
        if(rs.wasNull()) {
            return null;
        }
        for(PersistentEnum value : returnedClass().getEnumConstants()) {
            if(id == value.getId()) {
                return value;
            }
        }
        throw new IllegalStateException("Unknown " + returnedClass().getSimpleName() + " id");
    }

    @Override
    public void nullSafeSet(PreparedStatement st, Object value, int index)
            throws HibernateException, SQLException {
        if (value == null) {
            st.setNull(index, Types.INTEGER);
        } else {
            st.setInt(index, ((PersistentEnum)value).getId());
        }
    }

    @Override
    public Object replace(Object original, Object target, Object owner)
            throws HibernateException {
        return original;
    }

    @Override
    public abstract Class<T> returnedClass();

    @Override
    public int[] sqlTypes() {
        return new int[]{Types.INTEGER};
    }

}

The interesting methods are nullSafeGet(), which is called when the result set from the database is mapped to an object, and nullSafeSet(), which is called when the fields of an object are mapped to SQL parameters of insert/update/delete statements.

The extension point is the abstract method returnedClass() - in every subclass we will override it so it returns the specific enum class. A User Type for the Gender enum defined above would look like this:

public class GenderUserType extends PersistentEnumUserType<Gender> {

    @Override
    public Class<Gender> returnedClass() {
        return Gender.class;
    }

}

The last thing to do is to configure fields of enum types to use the appropriate user types:

@Entity
public class Person {

    @Type(type="it.recompile.GenderUserType")
    private Gender gender;

    // other fields and methods...
}

Note that the type attribute should contain the fully qualified class name of the user type, including the package.

That's it - now you can safely make changes to your enum classes without worrying about problems with existing data, as long as you don't change the id values in the constructors (which you have no reason to do).

Wednesday, January 19, 2011

Simplifying JPA Code with Hades

JPA is not that much fun anymore. It was a great improvement compared to plain JDBC and the XML-based mapping of many persistence frameworks, but after a while you come to realize that in many cases you need to write quite a lot of code to do very little work (like many other things in Java).

Enter Hades. In one sentence: it reduces the amount of JPA code by using convention over configuration. The Hades documentation is clear and comprehensive, so I won't explain it in detail here. Instead, I will demonstrate the process of migrating a simple JPA project to Hades, something you will probably want to do once you get to know it.

Setting Up a JPA Project

In this example we use Hibernate as the persistence provider and JUnit for the unit tests. If you want to use Maven, you just need to add the following dependencies to your POM:

<dependency>
   <groupId>org.hibernate</groupId>
   <artifactId>hibernate-entitymanager</artifactId>
   <version>3.6.0.Final</version>
</dependency> 
<dependency>
   <groupId>junit</groupId>
   <artifactId>junit</artifactId>
   <version>4.8.2</version>
   <scope>test</scope>
</dependency> 

Let's create a very simple persistent model class named User:

@Entity
public class User {
    
    @Id
    @GeneratedValue(strategy=IDENTITY)
    private Long id;
  
    private String username;
    
    private String fullName;
        
    private User() {}
    
    public User(String username, String fullName) {
        this.username = username;
        this.fullName = fullName;
    }
    
    public Long getId() {
        return id;
    }

    public String getUsername() {
        return username;
    }

    public String getFullName() {
        return fullName;
    }
    
}  

A simple DAO interface:

public interface UserDao {

    void saveUser(User user);
    User findById(Long id);
    List<User> findByFullName(String fullName);
    void deleteUser(User user);
}

And of course there is a DAO implementation class, but we will not show it here; soon you will see why.

No project is complete without unit tests, so here is a very simple one:

public class UserDaoTest {

    @Test
    public void generalTest() {

        EntityManagerFactory emf = Persistence.createEntityManagerFactory("pu");
        EntityManager em = emf.createEntityManager();
        UserDao userDao = new UserDaoImpl(em);
        
        final String username = "jsmith";
        final String fullName = "John Smith";
        final Long id;
        
        // Save user
        User user = new User(username, fullName);
        userDao.saveUser(user);
        id = user.getId();
        
        // Find by ID
        user = userDao.findById(id);
        assertEquals(username, user.getUsername());
        
        // Find by Full Name
        List<User> results = userDao.findByFullName("John Smith");
        assertEquals(1, results.size());
        assertEquals(username, results.get(0).getUsername());
        
        // Delete
        userDao.deleteUser(user);
        user = userDao.findById(id);
        assertNull(user);
    }
}

Regarding implementation details: the findByFullName() method can be implemented either with a named query or with the Criteria API; it doesn't matter for our example.
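
For example, the named-query variant might look roughly like this (a sketch of one possible implementation; the actual implementation class is deliberately not shown in this post):

// Hypothetical implementation of findByFullName() in UserDaoImpl, backed by a named
// query that would be declared on the entity, e.g.
// @NamedQuery(name = "User.findByFullName",
//             query = "select u from User u where u.fullName = ?1")
public List<User> findByFullName(String fullName) {
    return em.createNamedQuery("User.findByFullName", User.class)
             .setParameter(1, fullName)
             .getResultList();
}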

Adding Hades

Add Hades to your project. With Maven we add this dependency:

<dependency>
   <groupId>org.synyx.hades</groupId>
   <artifactId>org.synyx.hades</artifactId>
   <version>2.0.1.RELEASE</version>
</dependency> 

To make our DAO work with Hades, we make UserDao inherit from GenericDao<User, Long>. This is a Hades interface where the first type is the persistent class it will work with and the second type is the type of the persistent class identifier.
Now we simply delete stuff. That's right - we delete the DAO implementation class, the named query (if you have one) and some of the DAO interface methods.

The UserDao interface should now look like this:

public interface UserDao extends GenericDao<User, Long> {

    List<User> findByFullName(String fullName);
}

But what about all the methods that we removed? They are no longer needed because GenericDao has equivalent methods, and Hades provides the implementation. Take a look at the JavaDoc of GenericDao to see what you get "for free".

That's great, but doesn't findByFullName() need an implementation? That depends. For simple finder methods you can use method names that match Hades' conventions. In this case, the prefix "findBy" preceding "FullName" means this is a search query that uses the "fullName" property in the "where" clause, so Hades generates this query from the method name: "from User where fullName = ?".
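
For example, adding another finder on the username property (a hypothetical addition, not part of the example above) requires nothing but a conforming method name:

public interface UserDao extends GenericDao<User, Long> {

    // Hades derives "from User where fullName = ?" from the method name
    List<User> findByFullName(String fullName);

    // Hypothetical extra finder: "from User where username = ?" is derived the same way
    List<User> findByUsername(String username);
}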

If we do want to write more complex queries, we can do it by annotating the method in the DAO interface. Let's change findByFullName() to use the "like" operator instead of "=" :

public interface UserDao extends GenericDao<User, Long> {

    @Query("from User where fullName like ?1")
    List<User> findByFullName(String fullName); 

}

This makes much more sense (at least to me) than writing the query as part of the model code.

So, if there is no implementation class (more correctly, Hades generates the implementation class at runtime), how do we instantiate the DAO? We use Hades' GenericDaoFactory. Let's change the unit test code accordingly:

public class UserDaoTest {

    @Test
    public void generalTest() {
        
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("pu");
        EntityManager em = emf.createEntityManager();
        
        GenericDaoFactory gdf = GenericDaoFactory.create(em);
        UserDao userDao = gdf.getDao(UserDao.class);
        
        // rest of the code as before
    }
}

That's it. By using Hades we made our code much cleaner because:
  • We get the basic CRUD operations in every DAO out of the box
  • We don't need to write a DAO implementation class in many cases
  • We write JPQL queries in the DAO interface and not in the model
  • We don't even need to write JPQL queries for simple cases where we can use conventional method names

Although Hades removes the need to write DAO implementation code, sometimes we have no choice. Maybe we want to build a dynamic query, or we simply want to migrate gradually to Hades while keeping existing JPA code. In this case we can write an additional custom DAO interface with an implementation, and make the Hades DAO extend the custom interface:

public interface UserDaoCustom {

    List<User> searchUsers(String username, String fullName);
}

public class UserDaoImpl implements UserDaoCustom {

    @Override
    public List<User> searchUsers(String username, String fullName) {
        // implementation code...
    }
}

public interface UserDao extends GenericDao<User, Long>, UserDaoCustom {

    @Query("from User where fullName like ?1")
    List<User> findByFullName(String fullName); 
} 

When we use a custom DAO with an implementation, we need to tell the Hades factory to use this implementation, so we modify one line of the unit test code:

UserDao userDao = gdf.getDao(UserDao.class, UserDaoImpl.class);

So there - you don't have to make any drastic changes to your codebase at once in order to benefit from Hades.

There are plenty of nice features in Hades we didn't cover, such as pagination, sorting, named parameters and Spring integration. I encourage you to read the Hades documentation to learn about them. We will cover some of them in future posts.