Thursday, November 13, 2008

Windows 2000 on Mac Parallels

Windows 2000 Boot Disks


We first need images of the 4 Windows 2000 Boot Disks. After we mount the images, we can use /Applications/Utilities/Disk Utility.app to turn them into *.fdd images for Parallels.
  • Mount the downloaded disk images by double-clicking on them.
  • Run the Disk Utility.app.
  • Select the mounted disk n.
  • Click on New Image.
  • Name the image disk_n.
  • Select read/write Image Format with no Encryption.
  • Rename the file on the filesystem from disk_n.dmg to disk_n.fdd.


Save disk images.


Start Parallels


We can start /Applications/Parallels/Parallels Desktop.app and select New... Custom OS Installation using Windows 2000. My personal preference is to increase the default memory size to at least 512MB and to use Bridged Ethernet.
When the virtual machine starts, the floppy drive is not automatically connected, so we get the error No boot device available, press Enter to continue.
No boot device available.

We need to stop the virtual machine, click Floppy Disk on the Configuration screen, select Connect at startup and browse to the disk1.fdd Image File that we created earlier.
Select Connect at startup.

When we restart the virtual machine, the Windows 2000 Setup should appear.
Windows 2000 setup.

We can follow the prompts and switch out the disks by clicking on the floppy in the lower right corner of the screen.
After the 4th disk has been read, we can insert the Windows 2000 install CD into the CD-ROM drive, and Parallels will connect it automatically.

Install Parallels Tools


After the OS has been installed and we have logged in for the first time, the display is configured for VGA 16 colors with a maximum resolution of 800x600. This problem is easily solved by choosing Actions -> Install Parallels Tools. These tools also allow drag-and-drop from Mac to Windows and easy browsing across the Mac and Windows file systems.
Install Parallels Tools.

With Open Solaris, OS X and now Windows running on the same piece of hardware without needing to reboot to switch between them, I can finally eliminate the aging hardware gathering dust in my office.

Tuesday, November 4, 2008

Testing With Spring, Hibernate and JUnit4

This post builds on the functional code and test methods constructed in previous posts. While the techniques here are not tied to those specific situations, it may be helpful to review those posts for context on the original code, the problems we solved and the patterns that led to these re-factorings.

Motivation


To use the AbstractTransactionalDataSourceSpringContextTests with JUnit 4 test cases.

The Problem


When our test cases extend AbstractTransactionalDataSourceSpringContextTests, the JUnit 3.8 test runner executes them, so we are not able to use some features available in JUnit 4.
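
For reference, binding to the 3.8 runner happens simply because the test class from the previous posts extends that base class. A minimal sketch (details elided) looks like this:


package spring.hibernate.oracle.stored.procedures.dao;

import org.springframework.test.AbstractTransactionalDataSourceSpringContextTests;

public class AuthorDAOTest extends AbstractTransactionalDataSourceSpringContextTests {

    private AuthorDAO authorDAO;

    @Override
    protected String[] getConfigLocations() {
        return new String[] { "applicationContext.xml" };
    }

    // Injected by the Spring test superclass from applicationContext.xml.
    public void setAuthorDAO(final AuthorDAO authorDAO) {
        this.authorDAO = authorDAO;
    }

    // The superclass ultimately extends junit.framework.TestCase, so tests
    // run under the JUnit 3.8 runner and follow its testXxx naming convention.
    public void testFindByLastNameUsingHQL() throws Exception {
        // ...
    }
}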

Write a DAO Method and a JUnit 4 Test Case


We will first create a method in the DAO that would benefit from a feature available in JUnit 4 but not in JUnit 3.8: declaring that invoking the method is expected to throw a particular exception. For this example, the method can simply always throw the expected exception.


public void expectDataAccessException() {
    getHibernateTemplate().execute(new HibernateCallback() {
        public Object doInHibernate(final Session session) throws HibernateException, SQLException {
            throw new SQLException("");
        }
    });
}



The test case will declare the expectation that the particular exception will be thrown when the method under test is invoked.


@Test(expected = DataAccessException.class)
public void testExpectDataAccessException() throws Exception {
    getAuthorDAO().expectDataAccessException();
}



If we run this test now, it fails with an UncategorizedSQLException even though we have declared our expectation in the annotation: because the class still runs under the JUnit 3.8 runner, the @Test(expected = ...) annotation is simply ignored.
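
With that runner in charge, the same verification would have to be hand-coded with a try/catch and an explicit fail(), roughly like this sketch:


....
public void testExpectDataAccessException() throws Exception {
    try {
        getAuthorDAO().expectDataAccessException();
        fail("Expected a DataAccessException");
    } catch (final DataAccessException expected) {
        // Expected: Spring translates the SQLException into its DataAccessException hierarchy.
    }
}
....


The @Test(expected = DataAccessException.class) form is clearly preferable, which motivates the re-factoring below.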

Re-factor To Composition


The solution is to create a test delegate for Spring-specific configuration (e.g., the setup of the Spring configuration location and the injection of Spring-managed beans) and for Spring datasource transaction management. One of the more powerful features of this particular test base class is that transactions that occur in test cases are rolled back on tear-down, so we generally have no need to worry about test data corruption.


package spring.hibernate.oracle.stored.procedures.dao;

import org.springframework.test.AbstractTransactionalDataSourceSpringContextTests;

public class AbstractTransactionalDataSourceSpringContextTestsDelegate extends
        AbstractTransactionalDataSourceSpringContextTests {

    private AuthorDAO authorDAO;

    public void setup() throws Exception {
        super.setUp();
    }

    public void teardown() throws Exception {
        super.tearDown();
    }

    @Override
    protected String[] getConfigLocations() {
        return new String[] { "applicationContext.xml" };
    }

    public AuthorDAO getAuthorDAO() {
        return authorDAO;
    }

    public void setAuthorDAO(final AuthorDAO authorDAO) {
        this.authorDAO = authorDAO;
    }
}



Now we are ready to remove the delegated code from the test class (and remove the superclass dependency on AbstractTransactionalDataSourceSpringContextTests that causes the test to execute with the JUnit 3.8 runner). Note that we must declare @Before and @After methods to set up and tear down the Spring delegate. Also note that we have retained the test-case-specific setup (i.e., the population of Authors). Our class now consists only of data setup and tests.


package spring.hibernate.oracle.stored.procedures.dao;

import static org.junit.Assert.assertEquals;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.dao.DataAccessException;

import spring.hibernate.oracle.stored.procedures.domain.Author;

public class AuthorDAOTest {

    private final AbstractTransactionalDataSourceSpringContextTestsDelegate delegate;

    public AuthorDAOTest() {
        delegate = new AbstractTransactionalDataSourceSpringContextTestsDelegate();
    }

    @Before
    public void setUp() throws Exception {
        delegate.setup();
        createAuthor(1, "Jules", "Verne");
        createAuthor(2, "Charles", "Dickens");
        createAuthor(3, "Emily", "Dickinson");
        createAuthor(4, "Henry", "James");
        createAuthor(5, "William", "James");
        createAuthor(6, "Henry", "Thoreau");
    }

    @After
    public void tearDown() throws Exception {
        delegate.teardown();
    }

    @Test
    public void testFindByLastNameUsingHQL() throws Exception {
        assertEquals(2, delegate.getAuthorDAO().findByLastNameUsingHQL("James").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingHQL("Verne").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingHQL("Dickinson").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingHQL("Dickens").size());
        assertEquals(0, delegate.getAuthorDAO().findByLastNameUsingHQL("Whitman").size());
    }

    @Test
    public void testFindByLastNameUsingStoredProcedure() throws Exception {
        assertEquals(2, delegate.getAuthorDAO().findByLastNameUsingStoredProcedure("James").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingStoredProcedure("Verne").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingStoredProcedure("Dickinson").size());
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingStoredProcedure("Dickens").size());
        assertEquals(0, delegate.getAuthorDAO().findByLastNameUsingStoredProcedure("Whitman").size());
    }

    @Test
    public void testFindByFirstNameUsingFunction() throws Exception {
        assertEquals(0, delegate.getAuthorDAO().findByFirstNameUsingFunction("James").size());
        assertEquals(2, delegate.getAuthorDAO().findByFirstNameUsingFunction("Henry").size());
    }

    @Test
    public void testUpdate() throws Exception {
        final Author author = delegate.getAuthorDAO().findByLastNameUsingHQL("Thoreau").get(0);
        author.setLastName("Miller");
        delegate.getAuthorDAO().update(author);
        assertEquals(1, delegate.getAuthorDAO().findByLastNameUsingHQL("Miller").size());
    }

    @Test(expected = DataAccessException.class)
    public void testExpectDataAccessException() throws Exception {
        delegate.getAuthorDAO().expectDataAccessException();
    }

    private void createAuthor(final int id, final String firstName, final String lastName) {
        delegate.getJdbcTemplate().execute(
                String.format("insert into author (id, first_name, last_name) values (%d, '%s', '%s')", id, firstName, lastName));
    }
}



Observe Lifecycle Events


In order to listen for transaction-specific lifecycle events, we will create an Observer, effectively re-factoring method override to method observation.


package spring.hibernate.oracle.stored.procedures.dao;

public interface ITransactionListener {

    void onSetUpBeforeTransaction() throws Exception;

    void onSetUpInTransaction() throws Exception;

    void onTearDownInTransaction() throws Exception;

    void onTearDownAfterTransaction() throws Exception;
}



In the delegate, we now need to broadcast the transaction lifecycle events.


....
private final Collection<ITransactionListener> transactionListeners;

public AbstractTransactionalDataSourceSpringContextTestsDelegate() {
    transactionListeners = new ArrayList<ITransactionListener>();
}
....
public void registerTransactionListener(final ITransactionListener listener) {
    transactionListeners.add(listener);
}

@Override
protected void onSetUpBeforeTransaction() throws Exception {
    super.onSetUpBeforeTransaction();
    for (final ITransactionListener listener : transactionListeners)
        listener.onSetUpBeforeTransaction();
}

@Override
protected void onSetUpInTransaction() throws Exception {
    super.onSetUpInTransaction();
    for (final ITransactionListener listener : transactionListeners)
        listener.onSetUpInTransaction();
}

@Override
protected void onTearDownInTransaction() throws Exception {
    try {
        for (final ITransactionListener listener : transactionListeners)
            listener.onTearDownInTransaction();
    } finally {
        super.onTearDownInTransaction();
    }
}

@Override
protected void onTearDownAfterTransaction() throws Exception {
    try {
        for (final ITransactionListener listener : transactionListeners)
            listener.onTearDownAfterTransaction();
    } finally {
        super.onTearDownAfterTransaction();
    }
}



We can easily create a tracing transaction listener to demonstrate how this lifecycle observation works.


package spring.hibernate.oracle.stored.procedures.dao;

public class TracingTransactionListener implements ITransactionListener {

    public void onSetUpBeforeTransaction() throws Exception {
        System.out.println("TracingTransactionListener.onSetUpBeforeTransaction()");
    }

    public void onSetUpInTransaction() throws Exception {
        System.out.println("TracingTransactionListener.onSetUpInTransaction()");
    }

    public void onTearDownInTransaction() throws Exception {
        System.out.println("TracingTransactionListener.onTearDownInTransaction()");
    }

    public void onTearDownAfterTransaction() throws Exception {
        System.out.println("TracingTransactionListener.onTearDownAfterTransaction()");
    }
}



We will add this listener to the delegate during test setup. Note that the listener must be registered before the delegate itself is set up.


....
@Before
public void setUp() throws Exception {
    delegate.registerTransactionListener(new TracingTransactionListener());
    delegate.setup();
    ....
}



If we run this test class now, we should see console output that includes these statements:
  • TracingTransactionListener.onSetUpBeforeTransaction()
  • TracingTransactionListener.onSetUpInTransaction()
  • TracingTransactionListener.onTearDownInTransaction()
  • TracingTransactionListener.onTearDownAfterTransaction()

Monday, November 3, 2008

Non-ANSI Oracle ADD_MONTHS Function

Motivation


To use an Oracle function for adding months with the following characteristics:
  • When the resulting month has the same number of days as or fewer days than the initial month, and the initial day of the month is greater than the number of days in the resulting month, the resulting day should fall on the last day of the resulting month (this is how add_months already works).
  • When the resulting month has more days than the initial month, and the initial day is the last day of the initial month, the resulting day should be the same as the initial day (this is not how add_months works).


The Problem


While using the Oracle add_months function for date calculations, I started noticing an unexpected and unintuitive result when a new date is calculated from the last day of certain months. For example,


SELECT add_months(to_date('2009-02-28','YYYY-MM-DD'), 1) FROM dual;

ADD_MONTH
---------
31-MAR-09



I would have expected the resulting date to be 28-MAR-09.

Of course, in the case where the initial month contains more days than the resulting month, I get the results that I expect.


SELECT add_months(to_date('2009-01-31','YYYY-MM-DD'), 1) FROM dual;

ADD_MONTH
---------
28-FEB-09



This feature appears to be part of the ANSI definition for interval math, but this result does not seem particularly intuitive to me.

Unfortunately, the numtoyminterval function only gives the result we expect when going from a month with fewer days to a month with more days; when going from a month with more days to one with fewer, it raises an exception when calculating from the last day of the month (or from any day of the month greater than the number of days in the resulting month).


SELECT to_date('2009-02-28','YYYY-MM-DD') + numtoyminterval(1, 'month') FROM dual;

TO_DATE('
---------
28-MAR-09

SELECT to_date('2009-01-31','YYYY-MM-DD') + numtoyminterval(1, 'month') FROM dual
*
ERROR at line 1:
ORA-01839: date not valid for month specified



The Function


The function itself is fairly straightforward, combining add_months and numtoyminterval. When going from a month with more days to one with fewer, add_months yields the expected result. When going from a month with fewer days to one with more, numtoyminterval is safe to use because there is no overflow.


CREATE OR REPLACE FUNCTION non_ansi_add_months
  ( vDate DATE,
    vMonths INTEGER )
  RETURN DATE AS
  newDate DATE;
BEGIN
  newDate := add_months(vDate, vMonths);
  IF to_char(vDate, 'DD') < to_char(newDate, 'DD') THEN
    newDate := vDate + numtoyminterval(vMonths, 'month');
  END IF;
  RETURN newDate;
END non_ansi_add_months;



The Result


This function now yields the results we expect.


SELECT non_ansi_add_months(to_date('2009-02-28','YYYY-MM-DD'), 1) FROM dual;

NON_ANSI_
---------
28-MAR-09

SELECT non_ansi_add_months(to_date('2009-01-31','YYYY-MM-DD'), 1) FROM dual;

NON_ANSI_
---------
28-FEB-09



The function even works as expected when adding negative months (calculating month intervals in the past). For example,


SELECT non_ansi_add_months(to_date('2009-02-28','YYYY-MM-DD'), -1) FROM dual;

NON_ANSI_
---------
28-JAN-09

SELECT non_ansi_add_months(to_date('2009-03-30','YYYY-MM-DD'), -1) FROM dual;

NON_ANSI_
---------
28-FEB-09

Sunday, November 2, 2008

Hibernate Updates and Oracle Stored Procedures

This is a continuation of a previous post and builds on the project setup contained therein.

Motivation: It is sometimes desirable to use Oracle stored procedures for standard CUD (create, update, delete) operations; for example, when re-factoring a database to remove trigger calls, one possible solution would be to allow modifications to tables only through procedures. The operations performed by triggers can then be moved into these procedures.

Add an Update Method


We will first use Hibernate's built-in support for updating our Author domain Object. We can add a new method to the AuthorDAO to examine its behavior.


....
public void update(final Author author) {
    getHibernateTemplate().update(author);
}
....



Add a Test Case


We generally should not test the frameworks that we use, but here we will be moving outside of the framework. We want to make sure that using our custom update stored procedure will not break the existing behavior that Hibernate provides to us, so we will add a new verification of update behavior to the AuthorDAOTest.


....
@Test
public void testUpdate() throws Exception {
    final Author author = getAuthorDAO().findByLastNameUsingHQL("Thoreau").get(0);
    author.setLastName("Miller");
    getAuthorDAO().update(author);
    assertEquals(1, getAuthorDAO().findByLastNameUsingHQL("Miller").size());
}
....



Write a Stored Procedure


Now we will write our custom update stored procedure. To figure out the procedure's argument signature, we can look at the Hibernate console output when we run the test case above. We should note the order of the arguments, which is alphabetic for the updated fields with the primary key in the last position.
  • Hibernate: update MY_ORCL.AUTHOR set FIRST_NAME=?, LAST_NAME=? where ID=?

This order must be maintained in our stored procedure call. We can also use this statement as the basis for the update contained in the procedure body.
We can create this stored procedure through SQL*Plus.


CREATE OR REPLACE PROCEDURE update_author
  ( vFirstName IN author.first_name%type,
    vLastName IN author.last_name%type,
    vId IN author.id%type ) AS
BEGIN
  UPDATE author SET first_name=vFirstName, last_name=vLastName WHERE id=vId;
END update_author;



Call the Stored Procedure From @SQLUpdate


Now that we have a procedure in our schema, we need a way to call it from our DAO. Hibernate provides annotations specific to these CUD operations:
  • @org.hibernate.annotations.SQLUpdate
  • @org.hibernate.annotations.SQLInsert
  • @org.hibernate.annotations.SQLDelete
  • @org.hibernate.annotations.SQLDeleteAll

Here, we can add the custom update to our Author domain Object, alongside the named queries from the last tutorial.


....
@Entity
@org.hibernate.annotations.NamedNativeQuery(name = "findByLastName", query = "call findByLastName(?, :vLastName)", callable = true, resultClass = Author.class)
@javax.persistence.NamedNativeQuery(name = "findByFirstName", query = "{ ? = call findByFirstName(:vFirstName) }", resultClass = Author.class, hints = { @javax.persistence.QueryHint(name = "org.hibernate.callable", value = "true") })
@org.hibernate.annotations.SQLUpdate(sql = Author.UPDATE_AUTHOR)
@Table(name = "AUTHOR", schema = "MY_ORCL")
public class Author implements java.io.Serializable {

    public static final String UPDATE_AUTHOR = "call update_author(:vFirstName, :vLastName, :vId)";
....


If we run our test case again, we will see that the stored procedure is now called to perform the update.
  • Hibernate: call update_author(:vFirstName, :vLastName, :vId)


Write a Hibernate Interceptor


Suppose we would like to pass an extra parameter to the stored procedure, say a counter for the number of times the procedure has been called from a particular instance of an application. We can add this parameter to the end of the procedure's argument list.


CREATE OR REPLACE PROCEDURE update_author
  ( vFirstName IN author.first_name%type,
    vLastName IN author.last_name%type,
    vId IN author.id%type,
    vCounter INTEGER ) AS
BEGIN
  UPDATE author SET first_name=vFirstName, last_name=vLastName WHERE id=vId;
END update_author;



We will also need to modify the call syntax where it is declared in the domain Object to accept this new parameter.


....
public class Author implements java.io.Serializable {

    public static final String UPDATE_AUTHOR = "call update_author(:vFirstName, :vLastName, :vId, :vCounter)";
....



Finally, we can write an Interceptor that supplies the value for this parameter. The Interceptor increments a static counter each time the update_author procedure is called and substitutes the current value into the statement that is passed to the stored procedure.


package spring.hibernate.oracle.stored.procedures.domain;

import java.util.concurrent.atomic.AtomicLong;
import org.hibernate.EmptyInterceptor;

public class UpdateAuthorInterceptor extends EmptyInterceptor {

private static final long serialVersionUID = 2908952460484632623L;
private static final AtomicLong counter = new AtomicLong();

@Override
public String onPrepareStatement(final String sql) {
if (sql.equals(Author.UPDATE_AUTHOR)) {
return sql.replaceFirst(":vCounter", String.valueOf(counter.getAndIncrement()));
}
return super.onPrepareStatement(sql);
}
}



Now we will configure the Interceptor in the applicationContext.xml for use by the LocalSessionFactoryBean.


....
<bean id="sessionFactory"
      class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
    ....
    <property name="entityInterceptor" ref="updateAuthorInterceptor" />
</bean>
<bean id="updateAuthorInterceptor"
      class="spring.hibernate.oracle.stored.procedures.domain.UpdateAuthorInterceptor" />
....



If we run the test case again, we will see console output indicating that Hibernate is calling the update procedure with this counter value.
  • Hibernate: call update_author(:vFirstName, :vLastName, :vId, 0)


Other CUD Operations


With these same steps, we can customize Create (@SQLInsert) and Delete (@SQLDelete and @SQLDeleteAll) operations to use stored procedures.
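
For instance, assuming companion insert_author and delete_author procedures whose parameters mirror the statements Hibernate generates (the procedure names and signatures here are illustrative, not part of the schema above), the annotations on Author might look roughly like this, following the same named-parameter style as UPDATE_AUTHOR:


....
// insert_author and delete_author are hypothetical procedures, not created earlier in this post.
@Entity
@org.hibernate.annotations.SQLInsert(sql = "call insert_author(:vFirstName, :vLastName, :vId)")
@org.hibernate.annotations.SQLDelete(sql = "call delete_author(:vId)")
@Table(name = "AUTHOR", schema = "MY_ORCL")
public class Author implements java.io.Serializable {
....


As with the update, the parameter order would have to match the insert and delete statements that Hibernate logs to the console.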

Saturday, November 1, 2008

Solipse on Parallels

Solipse has recently been updated for the latest stable version of Eclipse Ganymede (3.4.1).
One quick note on this update:
Recently, I acquired a Mac and installed Parallels with a trial license. Installation of SXCE b91 went smoothly, except that a few of the default settings needed tweaking: the memory must be at least 768 MB (I bumped mine to 1024 MB) or else there are out-of-memory errors, and the network adapter must be Bridged Ethernet to connect over WiFi.
Solaris Express on Mac Parallels

Opera is the first piece of software I always install. After that, I decided to try the eclipse solaris-gtk-x86 build, for several reasons: the latest version supported by the solipse script was out of date, a user had reported a bug in our provisioning of the SDK Profile, I wanted to see whether Eclipse would be affected at all by the changes in JDK 6u10, and I wanted to see how well Parallels would manage such a memory-intensive process.
Downloading and installing the prerequisite software (SunStudio 11, which I have on a local share; JDKs 1.4, 5 and 6; and Apache Ant 1.7.1, which I prefer to the default 1.6.5 installation in SXCE) went as smoothly as possible.
My first attempt at a 3.4.1 build did not go quite as well as I had hoped, and I eventually reverted to 3.4.0 because the provisioning bug was more pressing than the version upgrade. The build time dropped from 75 minutes on my other SXCE installation (a dual-processor P3 at 1 GHz per chip with 2 GB RAM) to 44 minutes under Parallels.
After fixing the p2 bug in the script and building and provisioning 3.4.0 with JDK 6u10, my final task was to figure out what had changed in the Eclipse build between 3.4.0 and 3.4.1. Eventually I found the culprit: a modified classpath in the org.eclipse.osgi plug-in build.xml. Yet another find-and-replace went into the solipse script, along with a few more parameters for versioning information, and soon the build was humming to a happy completion.
Solaris Express on Mac Parallels

My trial period for Parallels has not yet expired! The savings in build time alone will easily make the cost of a license worthwhile....