Tuesday, December 29, 2009

What is the difference between HashMap and Hashtable?

Both collections implement Map and store values as key-value pairs. The key differences between the two are:

1. Access to a Hashtable is synchronized on the table, while access to a HashMap isn't. You can add synchronization yourself (see the sketch after this list), but it isn't there by default.

2. Another difference is that the iterator of a HashMap is fail-fast, while the enumerator of a Hashtable isn't: if you change the map while iterating over it, you'll know. Fail-fast means that "if the map is structurally modified at any time after the iterator is created, in any way except through the iterator's own remove method, the iterator will throw a ConcurrentModificationException".

3. HashMap permits null values and a single null key, while Hashtable allows neither null keys nor null values.
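A minimal sketch of points 1 and 3 (the class and variable names are only illustrative):

import java.util.Collections;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;

public class MapDifferences {
    public static void main(String[] args) {
        // Point 1: HashMap is not synchronized by default; wrap it if you need thread safety.
        Map<String, String> syncMap = Collections.synchronizedMap(new HashMap<String, String>());
        syncMap.put("key", "value");

        // Point 3: HashMap accepts one null key and any number of null values...
        Map<String, String> hashMap = new HashMap<String, String>();
        hashMap.put(null, "null key is fine");
        hashMap.put("k", null);

        // ...while Hashtable rejects both with a NullPointerException.
        Map<String, String> hashtable = new Hashtable<String, String>();
        try {
            hashtable.put(null, "value");
        } catch (NullPointerException expected) {
            System.out.println("Hashtable rejects null keys");
        }
    }
}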

Use of hashCode() and equals() in Java

The Object class provides two methods, hashCode() and equals(), which together define the identity of an object. It is a common convention that if one of them is overridden, the other should be overridden as well.

Before explaining why, let's see what contract these two methods must honour. As per the Java API documentation:

* Whenever it is invoked on the same object more than once during an execution of a Java application, the hashCode() method must consistently return the same integer, provided no information used in equals() comparisons on the object is modified. This integer need not remain consistent from one execution of an application to another execution of the same application.
* If two objects are equal according to the equals(Object) method, then calling the hashCode() method on each of the two objects must produce the same integer result.
* It is NOT required that if two objects are unequal according to the equals(java.lang.Object) method, then calling the hashCode() method on each of the two objects must produce distinct integer results. However, the programmer should be aware that producing distinct integer results for unequal objects may improve the performance of hash tables.

Now, consider an example where the key used to store a value in the HashMap is an Integer, and suppose for a moment that the Integer class did not implement the hashCode() method. The code would look like:

Map map = new HashMap();
map.put(new Integer(5), "Value1");
String value = (String) map.get(new Integer(5));
System.out.println(value);
// Output (with the hypothetical hashCode-less Integer): null

Null would be printed because, without a proper hashCode(), the Integer created for the get() call returns a different hash value than the one used for the put(), so the HashMap searches in a different bucket than the one the entry was stored in.

Now, since the Integer class does implement hashCode(), roughly like this:

public int hashCode() {
    return value;
}

every new Integer object created with the same int value returns the same hash value. Because the hash value is the same, the HashMap goes to the same bucket every time and, if more than one object is present for that hash value, uses equals() to identify the correct object.

Another point of caution: the fields used in hashCode() should not be ones whose values can change while the object is in use as a key.

Consider the example:

public class FourWheeler implements Vehicle {

    private String name;
    private int purchaseValue;
    private int noOfTyres;

    public FourWheeler() {}

    public FourWheeler(String name, int purchaseValue) {
        this.name = name;
        this.purchaseValue = purchaseValue;
    }

    public void setPurchaseValue(int purchaseValue) {
        this.purchaseValue = purchaseValue;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + ((name == null) ? 0 : name.hashCode());
        result = prime * result + purchaseValue;
        return result;
    }
}

FourWheeler fourWObj = new FourWheeler("Santro", 333333);
map.put(fourWObj, "Hyundai");
fourWObj.setPurchaseValue(555555);
System.out.println(map.get(fourWObj));
// Output: null

We can see that in spite of passing the same object, the value returned is null. This is because the hash code computed at lookup time differs from the one computed when the entry was stored, since purchaseValue was changed from 333333 to 555555. Hence we can conclude that hashCode() should be based on fields that do not change while the object is used as a key.

One compatible, but not all that useful, way to define hashCode() is like this:

public int hashCode() {
    return 0;
}

This approach is legal but yields terrible HashMap performance, because every key ends up in the same bucket. The conclusion: hashCode() must return the same value for objects that are equal, and should (but is not required to) return different values for objects that are not equal.

Overriding equals() method

Consider the example:

public class StringHelper {

    private String inputString;

    public StringHelper(String string) {
        inputString = string;
    }

    @Override
    public int hashCode() {
        return inputString.length();
    }

    public String getInputString() {
        return inputString;
    }

    public static void main(String[] args) {
        StringHelper helperObj = new StringHelper("string");
        StringHelper helperObj1 = new StringHelper("string");

        if (helperObj.hashCode() == helperObj1.hashCode()) {
            System.out.println("HashCodes are equal");
        }

        if (helperObj.equals(helperObj1)) {
            System.out.println("Objects are equal");
        } else {
            System.out.println("Objects are not equal");
        }
    }
}

// Output:
// HashCodes are equal
// Objects are not equal

We can see that even though both StringHelper objects contain the same value, equals() returned false while their hash codes were equal.

To prevent this inconsistency, we should override both methods so that the contract between them is not broken.

Steps to take into consideration while implementing the equals() method:

1. Use the == operator to check if the argument is a reference to this object. If so, return true. This is just a performance optimization, but one that is worth doing if the comparison is potentially expensive.

2. Use the instanceof operator to check if the argument has the correct type.

If not, return false. Typically, the correct type is the class in which the method occurs. Occasionally, it is some interface implemented by this class. Use an interface if the class implements an interface that refines the equals contract to permit comparisons across classes that implement the interface. Collection interfaces such as Set, List, Map, and Map.Entry have this property.

3. Cast the argument to the correct type. Because this cast was preceded by an instanceof test, it is guaranteed to succeed.

4. For each "significant" field in the class, check whether that field of the argument matches the corresponding field of this object. If all these tests succeed, return true; otherwise, return false.

5. When you are finished writing your equals method, ask yourself three questions: Is it symmetric? Is it transitive? Is it consistent?

A correct implementation of the equals() method for the StringHelper class could be:

@Override
public boolean equals(Object obj) {
    if (this == obj)
        return true;
    if (obj == null)
        return false;
    if (getClass() != obj.getClass())
        return false;
    final StringHelper other = (StringHelper) obj;
    if (inputString == null) {
        if (other.inputString != null)
            return false;
    } else if (!inputString.equals(other.inputString))
        return false;
    return true;
}
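With equals() now based on inputString, a hashCode() that keeps the contract can simply delegate to the same field. A minimal sketch, null-safe to match the null handling in equals():

@Override
public int hashCode() {
    return (inputString == null) ? 0 : inputString.hashCode();
}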

How does HashSet work?

Earlier I wasn't sure how HashSet is implemented internally and which data structure it uses. When I looked at the class implementation, I noticed the following:

HashSet stores unique elements and makes no guarantee about iteration order.

HashSet internally uses a HashMap.

Elements passed to the HashSet are stored as keys of that HashMap, with a shared dummy value. Since the elements are the keys, no extra check is needed to detect duplicates. For example, after adding the integers 1 and 2, if I add 1 again, no explicit lookup is performed to see whether 1 is already present; the HashSet simply performs a put() with that element ('1' in this case) as the key, which overwrites the existing entry.

Similarly, when an element is removed from the Set, the internal HashMap's remove() method is called.

So the HashSet data structure is nothing but a HashMap with the set's elements as keys.

HashSet implementation excerpts from the java.util package:

private transient HashMap<E, Object> map;

// Dummy value to associate with an Object in the backing Map
private static final Object PRESENT = new Object();

public HashSet() {
    map = new HashMap<E, Object>();
}

public boolean add(E o) {
    return map.put(o, PRESENT) == null;
}

/**
 * Removes the specified element from this set if it is present.
 *
 * @param o object to be removed from this set, if present.
 * @return true if the set contained the specified element.
 */
public boolean remove(Object o) {
    return map.remove(o) == PRESENT;
}
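A short usage sketch of the behaviour described above; add() reports whether the element was new by checking the return value of the underlying put():

import java.util.HashSet;
import java.util.Set;

public class HashSetDemo {
    public static void main(String[] args) {
        Set<Integer> set = new HashSet<Integer>();
        System.out.println(set.add(1)); // true  - put() returned null, so 1 was new
        System.out.println(set.add(2)); // true
        System.out.println(set.add(1)); // false - put() returned PRESENT, 1 was already there
        System.out.println(set.size()); // 2
    }
}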

How to create an immutable class?

An immutable class is a class whose contents cannot be changed once an instance is created, and which cannot be subclassed. Immutable objects are instances of an immutable class whose state cannot be changed once constructed, e.g. the String class.

To create an immutable class, the following steps should be followed:

1. Make the class final.
2. Set the values of the properties in the constructor only.
3. Make the properties of the class final and private.
4. Do not provide any setters for these properties.
5. If the instance fields include references to mutable objects, don't allow those objects to be changed:
1. Don't provide methods that modify the mutable objects.
2. Don't share references to the mutable objects. Never store references to external, mutable objects passed to the constructor; if necessary, create copies and store references to the copies. Similarly, create copies of your internal mutable objects when necessary to avoid returning the originals from your methods.

E.g.
public final class FinalPersonClass {

    private final String name;
    private final int age;

    public FinalPersonClass(final String name, final int age) {
        super();
        this.name = name;
        this.age = age;
    }

    public int getAge() {
        return age;
    }

    public String getName() {
        return name;
    }
}
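Point 5 above calls for defensive copies when a field refers to a mutable object. A minimal sketch, with a hypothetical java.util.Date field added to a similar class:

import java.util.Date;

public final class FinalEmployeeClass {

    private final String name;
    private final Date joiningDate; // mutable type, so copy it on the way in and on the way out

    public FinalEmployeeClass(final String name, final Date joiningDate) {
        this.name = name;
        this.joiningDate = new Date(joiningDate.getTime()); // store a copy, not the caller's reference
    }

    public String getName() {
        return name;
    }

    public Date getJoiningDate() {
        return new Date(joiningDate.getTime()); // return a copy, never the internal object
    }
}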

All the primitive wrapper classes in java.lang are immutable: Integer, Boolean, Character, Byte, Short, Long, Float, Double. Other well-known immutable classes are String, BigDecimal and BigInteger (the latter two live in java.math).

java.lang.OutOfMemoryError while doing an Ant build?

While doing an Ant build, it sometimes happens that a user gets an OutOfMemoryError. This is mainly because the heap size of the JVM is too small. The heap size can be increased, but note that it is the heap of the JVM launched by the Ant script that has to be increased, not that of whatever other JVM you happen to run.

To increase the heap size of the JVM used by Ant, open the file ant.cmd in the Ant/bin folder. (Alternatively, you can set the ANT_OPTS environment variable, which this script already passes to the JVM, to something like -Xms512m -Xmx1024m.)

Change the following line:

"%_JAVACMD%" %ANT_OPTS% -classpath "%ANT_HOME%\lib\ant-launcher.jar" "-Dant.home=%ANT_HOME%" org.apache.tools.ant.launch.Launcher %ANT_ARGS% -cp "%CLASSPATH%" %ANT_CMD_LINE_ARGS%

to:

"%_JAVACMD%" -Xms512m -Xmx1024m %ANT_OPTS% -classpath "%ANT_HOME%\lib\ant-launcher.jar" "-Dant.home=%ANT_HOME%" org.apache.tools.ant.launch.Launcher %ANT_ARGS% -cp "%CLASSPATH%" %ANT_CMD_LINE_ARGS%

Tuesday, November 17, 2009

Quartz Scheduler

Batch solutions are ideal for processing that is time and/or state based:

* Time-based: The business function executes on a recurring basis, running at pre-determined schedules.
* State-based: The jobs will be run when the system reaches a specific state.

Batch processes are usually data-centric and are required to handle large volumes of data off-line without affecting your on-line systems. This nature of batch processing requires proper scheduling of jobs. Quartz is a full-featured, open source job scheduling system that can be integrated with, or used alongside, virtually any Java EE or stand-alone application. The Quartz Scheduler includes many enterprise-class features, such as JTA transactions and clustering. The following is a list of features available:

* Can run embedded within another free-standing application.
* Can be instantiated within an application server (or servlet container).
* Can participate in XA transactions, via the use of JobStoreCMT.
* Can run as a stand-alone program (within its own Java Virtual Machine), to be used via RMI.
* Can be instantiated as a cluster of stand-alone programs (with load-balance and fail-over capabilities).
* Support for fail-over.
* Support for load balancing.

The following example demonstrates the use of the Quartz scheduler from a stand-alone application. Follow these steps to set up the example in Eclipse.

1. Download the latest version of quartz from opensymphony.
2. Make sure you have the following on your class path (Project -> Properties -> Java Build Path):
* The quartz jar file (quartz-1.6.0.jar).
* Commons logging (commons-logging-1.0.4.jar)
* Commons Collections (commons-collections-3.1.jar)
* Add any server runtime to your classpath in Eclipse. This is for including the Java Transaction API (JTA) used by Quartz. Alternatively, you can include the JTA class files in your classpath as follows:
1. Download the JTA classes zip file from the JTA download page.
2. Extract the files in the zip file to a subdirectory of your project in Eclipse.
3. Add the directory to your Java Build Path in the project->preferences, as a class directory.
3. Implement a Quartz job: a Quartz job is the task that will run at the scheduled time.

public class SimpleJob implements Job {

    public void execute(JobExecutionContext ctx) throws JobExecutionException {
        System.out.println("Executing at: " + Calendar.getInstance().getTime()
                + " triggered by: " + ctx.getTrigger().getName());
    }
}

SimpleJob.java
4. The following piece of code can be used to run the job using a scheduler.

public class QuartzTest {

    public static void main(String[] args) {
        try {
            // Get a scheduler instance.
            SchedulerFactory schedulerFactory = new StdSchedulerFactory();
            Scheduler scheduler = schedulerFactory.getScheduler();
            long ctime = System.currentTimeMillis();

            // Create a job detail and a trigger.
            JobDetail jobDetail = new JobDetail("Job Detail", "jGroup", SimpleJob.class);
            SimpleTrigger simpleTrigger = new SimpleTrigger("My Trigger", "tGroup");
            simpleTrigger.setStartTime(new Date(ctime));

            // Set the time interval and number of repeats.
            simpleTrigger.setRepeatInterval(100);
            simpleTrigger.setRepeatCount(10);

            // Add trigger and job to the scheduler.
            scheduler.scheduleJob(jobDetail, simpleTrigger);

            // Start the scheduler.
            scheduler.start();
        } catch (SchedulerException ex) {
            ex.printStackTrace();
        }
    }
}
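For the time-based case described at the start of this post (recurring, calendar-like schedules), Quartz 1.x also provides CronTrigger. A sketch that could replace the SimpleTrigger in the example above; the cron expression fires every five minutes:

// The three-argument constructor (name, group, cronExpression) throws ParseException,
// which would also need to be caught alongside SchedulerException.
CronTrigger cronTrigger = new CronTrigger("Cron Trigger", "tGroup", "0 0/5 * * * ?"); // every 5 minutes
scheduler.scheduleJob(jobDetail, cronTrigger);
scheduler.start();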

Spring Acegi Security Framework

You can find examples here:

http://stuffjava.blogspot.com/2009/09/spring-acegi-security-framework.html

http://stuffjava.blogspot.com/2008/07/acegi-security-in-one-hour-concise.html

Generic Data Access Objects in hibernate

The DAO interfaces
An implementation with Hibernate
Preparing DAOs with factories
Preparing DAOs with manual dependency injection
Preparing DAOs with lookup
Writing DAOs as managed EJB 3.0 components

This time I based the DAO example on interfaces. Tools like Hibernate already provide database portability, so persistence-layer portability shouldn't be the driving motivation for interfaces. However, DAO interfaces make sense in more complex applications, when several persistence services are encapsulated in one persistence layer. I'd say that you should use Hibernate (or the Java Persistence API) directly in most cases; the best reason to use an additional DAO layer is higher abstraction (e.g. methods like getMaximumBid() instead of session.createQuery(...) repeated a dozen times).
The DAO interfaces

I use one interface per persistent entity, with a super interface for common CRUD functionality:

public interface GenericDAO<T, ID extends Serializable> {

    T findById(ID id, boolean lock);

    List<T> findAll();

    List<T> findByExample(T exampleInstance, String... excludeProperty);

    T makePersistent(T entity);

    void makeTransient(T entity);
}

You can already see that this is going to be a pattern for a state-oriented data access API, with methods such as makePersistent() and makeTransient(). Furthermore, to implement a DAO you have to provide an entity type argument and an identifier type argument. As for most ORM solutions, identifier types have to be serializable.

The DAO interface for a particular entity extends the generic interface and provides the type arguments:

public interface ItemDAO extends GenericDAO<Item, Long> {

    public static final String QUERY_MAXBID = "ItemDAO.QUERY_MAXBID";
    public static final String QUERY_MINBID = "ItemDAO.QUERY_MINBID";

    Bid getMaxBid(Long itemId);
    Bid getMinBid(Long itemId);

}

We basically separate generic CRUD operations and actual business-related data access operations from each other. (Ignore the named query constants for now; they are convenient if you use annotations.) However, even if only CRUD operations are needed for a particular entity, you should still write an interface for it, even if it is going to be empty. It is important to use a concrete DAO type in your controller code, otherwise you will face some refactoring once you have to introduce specific data access operations for this entity.
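For example, a CRUD-only entity still gets its own (empty) interface. A sketch for the Comment entity that appears later in the factory (a Long identifier is assumed, like the other entities here):

public interface CommentDAO extends GenericDAO<Comment, Long> {
    // Intentionally empty: only the inherited CRUD operations are needed so far.
    // Entity-specific finders can be added here later without touching controller code.
}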
An implementation with Hibernate

An implementation of the interfaces could be done with any state-management capable persistence service. First, the generic CRUD implementation with Hibernate:

public abstract class GenericHibernateDAO<T, ID extends Serializable>
        implements GenericDAO<T, ID> {

    private Class<T> persistentClass;
    private Session session;

    @SuppressWarnings("unchecked")
    public GenericHibernateDAO() {
        this.persistentClass = (Class<T>) ((ParameterizedType) getClass()
                .getGenericSuperclass()).getActualTypeArguments()[0];
    }

    public void setSession(Session s) {
        this.session = s;
    }

    protected Session getSession() {
        if (session == null)
            throw new IllegalStateException("Session has not been set on DAO before usage");
        return session;
    }

    public Class<T> getPersistentClass() {
        return persistentClass;
    }

    @SuppressWarnings("unchecked")
    public T findById(ID id, boolean lock) {
        T entity;
        if (lock)
            entity = (T) getSession().load(getPersistentClass(), id, LockMode.UPGRADE);
        else
            entity = (T) getSession().load(getPersistentClass(), id);

        return entity;
    }

    public List<T> findAll() {
        return findByCriteria();
    }

    @SuppressWarnings("unchecked")
    public List<T> findByExample(T exampleInstance, String... excludeProperty) {
        Criteria crit = getSession().createCriteria(getPersistentClass());
        Example example = Example.create(exampleInstance);
        for (String exclude : excludeProperty) {
            example.excludeProperty(exclude);
        }
        crit.add(example);
        return crit.list();
    }

    public T makePersistent(T entity) {
        getSession().saveOrUpdate(entity);
        return entity;
    }

    public void makeTransient(T entity) {
        getSession().delete(entity);
    }

    public void flush() {
        getSession().flush();
    }

    public void clear() {
        getSession().clear();
    }

    /**
     * Use this inside subclasses as a convenience method.
     */
    @SuppressWarnings("unchecked")
    protected List<T> findByCriteria(Criterion... criterion) {
        Criteria crit = getSession().createCriteria(getPersistentClass());
        for (Criterion c : criterion) {
            crit.add(c);
        }
        return crit.list();
    }

}

There are some interesting things in this implementation. First, it clearly needs a Session to work, provided with setter injection. You could also use constructor injection. How you set the Session and what scope this Session has is of no concern to the actual DAO implementation. A DAO should not control transactions or the Session scope.

We need to suppress a few compile-time warnings about unchecked casts, because Hibernate's interfaces are JDK 1.4 only. What follows are the implementations of the generic CRUD operations, quite straightforward. The last method is quite nice, using another JDK 5.0 feature, varargs. It helps us to build Criteria queries in concrete entity DAOs. This is an example of a concrete DAO that extends the generic DAO implementation for Hibernate:

public class ItemDAOHibernate
        extends GenericHibernateDAO<Item, Long>
        implements ItemDAO {

    public Bid getMaxBid(Long itemId) {
        Query q = getSession().getNamedQuery(ItemDAO.QUERY_MAXBID);
        q.setParameter("itemid", itemId);
        return (Bid) q.uniqueResult();
    }

    public Bid getMinBid(Long itemId) {
        Query q = getSession().getNamedQuery(ItemDAO.QUERY_MINBID);
        q.setParameter("itemid", itemId);
        return (Bid) q.uniqueResult();
    }

}

Another example which uses the findByCriteria() method of the superclass with variable arguments:

public class CategoryDAOHibernate
        extends GenericHibernateDAO<Category, Long>
        implements CategoryDAO {

    public Collection<Category> findAll(boolean onlyRootCategories) {
        if (onlyRootCategories)
            return findByCriteria( Expression.isNull("parent") );
        else
            return findAll();
    }
}

Preparing DAOs with factories

We could bring it all together in a DAO factory, which not only sets the Session when a DAO is constructed but also contains nested classes to implement CRUD-only DAOs with no business-related operations:

public class HibernateDAOFactory extends DAOFactory {

    public ItemDAO getItemDAO() {
        return (ItemDAO) instantiateDAO(ItemDAOHibernate.class);
    }

    public CategoryDAO getCategoryDAO() {
        return (CategoryDAO) instantiateDAO(CategoryDAOHibernate.class);
    }

    public CommentDAO getCommentDAO() {
        return (CommentDAO) instantiateDAO(CommentDAOHibernate.class);
    }

    public ShipmentDAO getShipmentDAO() {
        return (ShipmentDAO) instantiateDAO(ShipmentDAOHibernate.class);
    }

    private GenericHibernateDAO instantiateDAO(Class daoClass) {
        try {
            GenericHibernateDAO dao = (GenericHibernateDAO) daoClass.newInstance();
            dao.setSession(getCurrentSession());
            return dao;
        } catch (Exception ex) {
            throw new RuntimeException("Can not instantiate DAO: " + daoClass, ex);
        }
    }

    // You could override this if you don't want HibernateUtil for lookup
    protected Session getCurrentSession() {
        return HibernateUtil.getSessionFactory().getCurrentSession();
    }

    // Inline concrete DAO implementations with no business-related data access methods.
    // If we use public static nested classes, we can centralize all of them in one source file.

    public static class CommentDAOHibernate
            extends GenericHibernateDAO<Comment, Long>
            implements CommentDAO {}

    public static class ShipmentDAOHibernate
            extends GenericHibernateDAO<Shipment, Long>
            implements ShipmentDAO {}

}

This concrete factory for Hibernate DAOs extends the abstract factory, which is the interface we'll use in application code:

public abstract class DAOFactory {

    /**
     * Creates a standalone DAOFactory that returns unmanaged DAO
     * beans for use in any environment Hibernate has been configured
     * for. Uses HibernateUtil/SessionFactory and Hibernate context
     * propagation (CurrentSessionContext), thread-bound or transaction-bound,
     * and transaction scoped.
     */
    public static final Class HIBERNATE = org.hibernate.ce.auction.dao.hibernate.HibernateDAOFactory.class;

    /**
     * Factory method for instantiation of concrete factories.
     */
    public static DAOFactory instance(Class factory) {
        try {
            return (DAOFactory) factory.newInstance();
        } catch (Exception ex) {
            throw new RuntimeException("Couldn't create DAOFactory: " + factory, ex);
        }
    }

    // Add your DAO interfaces here
    public abstract ItemDAO getItemDAO();
    public abstract CategoryDAO getCategoryDAO();
    public abstract CommentDAO getCommentDAO();
    public abstract ShipmentDAO getShipmentDAO();

}

Note that this factory example is suitable for persistence layers which are primarily implemented with a single persistence service, such as Hibernate or EJB 3.0 persistence. If you have to mix persistence APIs, for example, Hibernate and plain JDBC, the pattern changes slightly. Keep in mind that you can also call session.connection() inside a Hibernate-specific DAO, or use one of the many bulk operation/SQL support options in Hibernate 3.1 to avoid plain JDBC.

Finally, this is how data access now looks in controller/command handler code (pick whatever transaction demarcation strategy you like; the DAO code doesn't change):

// EJB3 CMT: @TransactionAttribute(TransactionAttributeType.REQUIRED)
public void execute() {

// JTA: UserTransaction utx = jndiContext.lookup("UserTransaction");
// JTA: utx.begin();

// Plain JDBC: HibernateUtil.getCurrentSession().beginTransaction();

DAOFactory factory = DAOFactory.instance(DAOFactory.HIBERNATE);
ItemDAO itemDAO = factory.getItemDAO();
UserDAO userDAO = factory.getUserDAO();

Bid currentMaxBid = itemDAO.getMaxBid(itemId);
Bid currentMinBid = itemDAO.getMinBid(itemId);

Item item = itemDAO.findById(itemId, true);

newBid = item.placeBid(userDAO.findById(userId, false),
bidAmount,
currentMaxBid,
currentMinBid);

// JTA: utx.commit(); // Don't forget exception handling

// Plain JDBC: HibernateUtil.getCurrentSession().getTransaction().commit(); // Don't forget exception handling

}

The database transaction, either JTA or direct JDBC, is started and committed in an interceptor that runs for every execute(), following the Open Session in View pattern. You can use AOP for this or any kind of interceptor that can be wrapped around a method call, see Session handling with AOP.
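A minimal hand-rolled version of such an interceptor, without an AOP framework, could look like the following sketch. It assumes the HibernateUtil helper used throughout this article and plain JDBC transactions; executeWithTransaction() is a hypothetical wrapper name:

public void executeWithTransaction(Runnable command) {
    Session session = HibernateUtil.getSessionFactory().getCurrentSession();
    Transaction tx = session.beginTransaction();
    try {
        command.run(); // e.g. the body of execute() shown above
        tx.commit();   // in the default thread-bound setup, commit also closes the current Session
    } catch (RuntimeException ex) {
        tx.rollback(); // don't leave the transaction hanging on failure
        throw ex;
    }
}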
Preparing DAOs with manual dependency injection

You don't need to write the factories. You can just as well do this:

// EJB3 CMT: @TransactionAttribute(TransactionAttributeType.REQUIRED)
public void execute() {

// JTA: UserTransaction utx = jndiContext.lookup("UserTransaction");
// JTA: utx.begin();

// Plain JDBC: HibernateUtil.getCurrentSession().beginTransaction();

ItemDAOHibernate itemDAO = new ItemDAOHibernate();
itemDAO.setSession(HibernateUtil.getSessionFactory().getCurrentSession());

UserDAOHibernate userDAO = new UserDAOHibernate();
userDAO.setSession(HibernateUtil.getSessionFactory().getCurrentSession());

Bid currentMaxBid = itemDAO.getMaxBid(itemId);
Bid currentMinBid = itemDAO.getMinBid(itemId);

Item item = itemDAO.findById(itemId, true);

newBid = item.placeBid(userDAO.findById(userId, false),
bidAmount,
currentMaxBid,
currentMinBid);

// JTA: utx.commit(); // Don't forget exception handling

// Plain JDBC: HibernateUtil.getCurrentSession().getTransaction().commit(); // Don't forget exception handling

}

The disadvantage here is that the implementation classes (i.e. ItemDAOHibernate and UserDAOHibernate) of the persistence layer are exposed to the client, the controller. Also, constructor injection of the current Session might be more appropriate.
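A sketch of the constructor-injection variant mentioned above; only the parts of GenericHibernateDAO that change are shown:

public abstract class GenericHibernateDAO<T, ID extends Serializable>
        implements GenericDAO<T, ID> {

    private final Class<T> persistentClass;
    private final Session session;

    @SuppressWarnings("unchecked")
    protected GenericHibernateDAO(Session session) {
        this.session = session;
        this.persistentClass = (Class<T>) ((ParameterizedType) getClass()
                .getGenericSuperclass()).getActualTypeArguments()[0];
    }

    protected Session getSession() {
        return session;
    }

    // findById(), findAll(), findByExample(), ... unchanged
}

Each concrete DAO then needs a matching constructor that passes the Session up to the superclass.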
Preparing DAOs with lookup

Alternatively, call HibernateUtil.getSessionFactory().getCurrentSession() as a fallback, if the client didn't provide a Session when the DAO was constructed:

public abstract class GenericHibernateDAO<T, ID extends Serializable>
        implements GenericDAO<T, ID> {

    private Class<T> persistentClass;
    private Session session;

    @SuppressWarnings("unchecked")
    public GenericHibernateDAO() {
        this.persistentClass = (Class<T>) ((ParameterizedType) getClass()
                .getGenericSuperclass()).getActualTypeArguments()[0];
    }

    public void setSession(Session session) {
        this.session = session;
    }

    protected Session getSession() {
        if (session == null)
            session = HibernateUtil.getSessionFactory().getCurrentSession();
        return session;
    }

    ...

The controller now uses these stateless data access objects through direct instantiation:

// EJB3 CMT: @TransactionAttribute(TransactionAttributeType.REQUIRED)
public void execute() {

// JTA: UserTransaction utx = jndiContext.lookup("UserTransaction");
// JTA: utx.begin();

// Plain JDBC: HibernateUtil.getCurrentSession().beginTransaction();

ItemDAO itemDAO = new ItemDAOHibernate();
UserDAO userDAO = new UserDAOHibernate();

Bid currentMaxBid = itemDAO.getMaxBid(itemId);
Bid currentMinBid = itemDAO.getMinBid(itemId);

Item item = itemDAO.findById(itemId, true);

newBid = item.placeBid(userDAO.findById(userId, false),
bidAmount,
currentMaxBid,
currentMinBid);

// JTA: utx.commit(); // Don't forget exception handling

// Plain JDBC: HibernateUtil.getCurrentSession().getTransaction().commit(); // Don't forget exception handling

}

The only disadvantage of this very simple strategy is that the implementation classes (i.e. ItemDAOHibernate and UserDAOHibernate) of the persistence layer are again exposed to the client, the controller. You can still supply a custom Session if needed (in an integration test, for example).

Each of these methods (factories, manual injection, lookup) for setting the current Session and creating a DAO instance has advantages and drawbacks, use whatever you feel most comfortable with.

Naturally, the cleanest way is managed components and EJB 3.0 session beans:
Writing DAOs as managed EJB 3.0 components

Turn your DAO superclass into a base class for stateless session beans (all your concrete DAOs are then stateless EJBs, they already have a business interface). This is basically a single annotation which you could even move into an XML deployment descriptor if you like. You can then use dependency injection and get the "current" persistence context provided by the container:

@Stateless
public abstract class GenericHibernateDAO<T, ID extends Serializable>
        implements GenericDAO<T, ID> {

    private Class<T> persistentClass;

    @PersistenceContext
    private EntityManager em;

    // The EntityManager is injected after construction, so obtain the Session afterwards.
    @PostConstruct
    public void init() {
        setSession( (Session) em.getDelegate() );
    }

    ...

You can then cast the delegate of an EntityManager to a Hibernate Session.

This only works if you use Hibernate as the Java Persistence provider, because the delegate is the Session API; in JBoss AS you could even get a Session injected directly. If you use a different Java Persistence provider, rely on the EntityManager API instead of Session. Now wire your DAOs into the controller, which is also a managed component:

@Stateless
public class ManageAuctionController implements ManageAuction {

@EJB ItemDAO itemDAO;
@EJB UserDAO userDAO;

@TransactionAttribute(TransactionAttributeType.REQUIRED) // This is even the default
public void execute() {

Bid currentMaxBid = itemDAO.getMaxBid(itemId);
Bid currentMinBid = itemDAO.getMinBid(itemId);

Item item = itemDAO.findById(itemId, true);

newBid = item.placeBid(userDAO.findById(userId, false),
bidAmount,
currentMaxBid,
currentMinBid);

}
}

Monday, October 26, 2009

How will you store your exception stack trace in a file?

ANS:

PrintWriter printWriter = null;
try {
    printWriter = new PrintWriter("D:/Exception.txt");
    int i = 10, j = 0;
    int res = i / j;
} catch (Exception e) {
    StackTraceElement[] str = e.getStackTrace();
    printWriter.write(e.getMessage());
    printWriter.println("-------------");
    for (int i = 0; i < str.length; i++) {
        printWriter.write(str[i].toString());
        printWriter.print(str[i].getLineNumber());
        printWriter.println();
    }
    printWriter.close();
}

This generates a file named Exception.txt in D:.
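A shorter alternative is the overload of printStackTrace() that takes a PrintWriter; a sketch:

import java.io.PrintWriter;

public class StackTraceToFile {
    public static void main(String[] args) throws Exception {
        try {
            int i = 10, j = 0;
            int res = i / j;
        } catch (Exception e) {
            PrintWriter printWriter = new PrintWriter("D:/Exception.txt");
            e.printStackTrace(printWriter); // writes the complete stack trace to the file
            printWriter.close();
        }
    }
}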

How can we display our web pages (JSP, XHTML, etc.) always from the top?

Using the following script, we can display our web pages (JSP, XHTML, etc.) always from the top:

window.scroll(0,0);

Creating BeanFactory, ApplicationContext and WebApplicationContext

Creating BeanFactory:

ClassPathResource res = new ClassPathResource("beans.xml");
XmlBeanFactory factory = new XmlBeanFactory(res);

Creating Application Context:

ClassPathXmlApplicationContext: loads the context definition from an XML file located on the classpath.
ApplicationContext context = new ClassPathXmlApplicationContext("bean_test.xml");

FileSystemXmlApplicationContext: loads the context definition from an XML file in the filesystem.
ApplicationContext context = new FileSystemXmlApplicationContext("bean_test.xml");


Creating WebApplicationContext:

WebApplicationContext springApplicationContext = WebApplicationContextUtils
.getWebApplicationContext(config.getContext());
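Once a factory or context has been created, beans are retrieved by name with getBean(). A short sketch (the MyService class and the "myService" bean id are hypothetical names, not part of the examples above):

ApplicationContext context = new ClassPathXmlApplicationContext("bean_test.xml");
MyService service = (MyService) context.getBean("myService");
service.doWork(); // hypothetical business method on the hypothetical bean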

Difference between Session and SessionFactory

The application obtains Session instances from a SessionFactory. There is typically a single SessionFactory for the whole application, created during application initialization. The SessionFactory caches generated SQL statements and other mapping metadata that Hibernate uses at runtime. It also holds cached data that has been read in one unit of work and may be reused in a future unit of work.

SessionFactory sessionFactory = configuration.buildSessionFactory();

The Session interface is the primary interface used by Hibernate applications. It is a single-threaded, short-lived object representing a conversation between the application and the persistent store. It allows you to create query objects to
retrieve persistent objects.
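A typical unit of work obtained from the SessionFactory looks like this sketch (Item and itemId are hypothetical; the pattern is open a Session, begin and commit a transaction, then close):

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
try {
    Item item = (Item) session.get(Item.class, itemId); // load a persistent object
    item.setName("new name");                           // hypothetical property change
    tx.commit();                                        // flushes the changes and commits
} catch (RuntimeException ex) {
    tx.rollback();
    throw ex;
} finally {
    session.close(); // the Session is short-lived; close it when the unit of work ends
}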

Lucene search Example

Creating index files:

Document document = new Document();
document.add(Field.Text("author", author));
document.add(Field.Text("title", title));
document.add(Field.Text("topic", topic));
document.add(Field.UnIndexed("url", url));
document.add(Field.Keyword("date", dateWritten));
document.add(Field.UnStored("article", article));
return document;

Analyzer analyzer = new StandardAnalyzer();
IndexWriter writer = new IndexWriter(indexDirectory, analyzer, false);
writer.addDocument(document);
writer.optimize();
writer.close();


Searching for a string:

String searchCriteria="some artical name";
IndexSearcher is = new IndexSearcher(indexDirectory);
Analyzer analyzer = new StandardAnalyzer();
QueryParser parser = new QueryParser("article", analyzer);
Query query = parser.parse(searchCriteria);
Hits hits = is.search(query);


for (int i=0; i < hits.length(); i++ )
{
Document doc = hits.doc(i);
// display the articles that were found to the user
}
is.close();


You can find more information here.

Wednesday, September 23, 2009

Inheritance Types in Hibernate with .hbm and annotation




Inheritance Types in Hibernate
1. Table per concrete class with implicit polymorphism
2. Table per concrete class with unions (same primary key for both tables)
3. Table per class hierarchy
4. Table per subclass
1. Table per concrete class with implicit polymorphism

Here the primary key values are different for the two tables.
For a query against the BillingDetails class Hibernate uses the following SQL:
select CREDIT_CARD_ID, OWNER, NUMBER, EXP_MONTH, EXP_YEAR ...
from CREDIT_CARD
select BANK_ACCOUNT_ID, OWNER, ACCOUNT, BANKNAME, ...
from BANK_ACCOUNT
@MappedSuperclass
public abstract class BillingDetails {
@Column(name = "OWNER", nullable = false)
private String owner;
...
}
@Entity
@AttributeOverride(name = "owner", column =
@Column(name = "CC_OWNER", nullable = false))
public class CreditCard extends BillingDetails {
@Id @GeneratedValue
@Column(name = "CREDIT_CARD_ID")
private Long id = null;
@Column(name = "NUMBER", nullable = false)
private String number;
...
}
2. Table per concrete class with unions

Here the same primary key is shared across the tables.

...

...


Here the same primary key is shared across the tables (the annotations below are the equivalent configuration to the union mapping).
Super class:
@Entity
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class BillingDetails {
@Id @GeneratedValue
@Column(name = "BILLING_DETAILS_ID")
private Long id = null;
@Column(name = "OWNER", nullable = false)
private String owner;
...
}
Subclass:
@Entity
@Table(name = "CREDIT_CARD")
public class CreditCard extends BillingDetails {
@Column(name = "NUMBER", nullable = false)
private String number;
...
}
A query for BillingDetails executes the following SQL statement:
select
BILLING_DETAILS_ID, OWNER, NUMBER, EXP_MONTH, EXP_YEAR, ACCOUNT, BANKNAME, SWIFT, CLAZZ_
from
( select BILLING_DETAILS_ID, OWNER,NUMBER, EXP_MONTH, EXP_YEAR,
null as ACCOUNT, null as BANKNAME, null as SWIFT,1 as CLAZZ_
from
CREDIT_CARD
union
select BILLING_DETAILS_ID, OWNER,null as NUMBER, null as EXP_MONTH, null as EXP_YEAR, ...ACCOUNT, BANKNAME, SWIFT,2 as CLAZZ_
from
BANK_ACCOUNT
)
3. Table per class hierarchy

...

...



Super Class:
@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn( name = "BILLING_DETAILS_TYPE",
discriminatorType = DiscriminatorType.STRING)
public abstract class BillingDetails {
@Id @GeneratedValue
@Column(name = "BILLING_DETAILS_ID")
private Long id = null;
@Column(name = "OWNER", nullable = false)
private String owner;
...
}
SubClass:
@Entity
@DiscriminatorValue("CC")
public class CreditCard extends BillingDetails {
@Column(name = "CC_NUMBER")
private String number;
...
}
4. Table per subclass

...

...



Hibernate relies on an outer join when querying the BillingDetails class:
select BD.BILLING_DETAILS_ID, BD.OWNER,
CC.NUMBER, CC.EXP_MONTH, ..., BA.ACCOUNT, BA.BANKNAME, ...
case
when CC.CREDIT_CARD_ID is not null then 1
when BA.BANK_ACCOUNT_ID is not null then 2
when BD.BILLING_DETAILS_ID is not null then 0
end as CLAZZ_ from BILLING_DETAILS BD
left join CREDIT_CARD CC
on BD.BILLING_DETAILS_ID = CC.CREDIT_CARD_ID
left join BANK_ACCOUNT BA
on BD.BILLING_DETAILS_ID = BA.BANK_ACCOUNT_ID

@Entity
@Inheritance(strategy = InheritanceType.JOINED)
public abstract class BillingDetails {
@Id @GeneratedValue
@Column(name = "BILLING_DETAILS_ID")
private Long id = null;
...
}
@Entity
public class BankAccount extends BillingDetails {
...
}
This entity has no identifier property; it automatically inherits the BILLING_DETAILS_ID property and column from the superclass, and Hibernate knows how to join the tables together if you want to retrieve instances of BankAccount. Of course, you can specify the column name explicitly:
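For instance, a sketch using the JPA @PrimaryKeyJoinColumn annotation, with the BANK_ACCOUNT_ID column name taken from the join SQL above:

@Entity
@PrimaryKeyJoinColumn(name = "BANK_ACCOUNT_ID")
public class BankAccount extends BillingDetails {
...
}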