Pangarchulla Trek

Pangarchulla Peak is a 5-day trek in the Garhwal region of Uttarakhand, located near Joshimath in Chamoli district. Its difficulty level is moderate to difficult, and the peak's height is around 4,498 m (~14,700 ft).

It is a winter trek. Daytime temperatures range from about 3°C to 10°C, and night temperatures from about 1°C to -10°C. The trek starts in greener terrain, initially covered with trees, but as you move higher the trees thin out and the snow increases. On summit day most of the route is fully covered with snow. The trek is surrounded by beautiful mountains; along the way you get views of Nanda Devi, Dronagiri, Chaukhamba and many more peaks.

We went there in the last week of December 2017, but I would say the best time to do this trek is probably around February to April; until then, many boulders and paths stay fully covered with snow.

It’s not always about Summit

Summit, the ultimate target. Everyone wants to reach there.
Reaching it gives a joy of accomplishment, sheer happiness, the self-confidence that "yes I can", the assurance of being fit enough, the feeling of conquering your own fears.

I had always wondered: what if someday I am not able to reach the summit? How would I feel?

My imagination was that I would feel a bit sad, probably very sad, angry, lost. Millions of thoughts would continuously run through my mind in the background, making it even worse (should I have walked faster, should I have practised more before the trek, what went wrong, what could have been done differently, and so on).


Git Clone Error : unable to get local issuer certificate

Recently I was cloning an internal git repository and got the below error:
Git Clone Error : unable to get local issuer certificate
On Linux (Ubuntu) the error message was:
fatal: unable to access ‘https://***.git/’: server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none
After a bit of googling I tried a few things and was able to solve it. I am noting down the steps here for future use. It should not take more than 5 minutes to fix.

Why part:

Before going to the fix, it first needs to be understood why this error came up in the first place.
Git ships with a predefined ca-bundle by default. This is the list of CAs (Certificate Authorities) that git trusts while making an SSL connection to a git repository. Most public repositories have their SSL certificate signed by a well-known CA, but our internal repository had a self-signed certificate.
Since it was a self-signed certificate, its CA was not in git's trusted CA list, so git refused to trust the SSL connection and dropped the request then and there.

Fix:

Export certificate in PEM format from the browser

Open the repository you are trying to access in a browser. I used Mozilla Firefox. On the left of Firefox's address bar there is a lock button.
Click on the lock button -> click on the arrow (>) -> More information.
From there: Security -> View Certificate -> Details -> Export Certificate.
While exporting the certificate select the format X.509 certificate with chain (PEM) and save it at some known location.
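
If a browser is not handy, the same certificate chain can usually be retrieved from the command line with openssl (the host name and port below are placeholders; use your git server's):

openssl s_client -showcerts -connect git.example.com:443 < /dev/null

Copy each block from -----BEGIN CERTIFICATE----- to -----END CERTIFICATE----- into a file with a .pem extension.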

Find the ca-bundle file location used by git

Windows:
It is generally located at: ${git_home_directory}mingw64/ssl/certs/ca-bundle.crt
You can find out the actual path by executing the command: git config --list
Look for the line starting with http.sslcainfo; on my machine it was:
http.sslcainfo=C:/Program Files/Git/mingw64/ssl/certs/ca-bundle.crt

Linux:
On Ubuntu the path is /etc/ssl/certs/ca-certificates.crt. It is also shown in the error message.

Copy the exported certificate into ca-bundle

Open the certificate you exported from the browser in notepad or a similar text editor. Copy everything, from the -----BEGIN CERTIFICATE----- line through the -----END CERTIFICATE----- line.
Open the ca-bundle file (ca-bundle.crt) and paste the copied content at the end.
Save the ca-bundle file.

Now restart your terminal, retry whatever git command you were running, and it should work!! Bingo!! Enjoy!!
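
Alternative: instead of appending to git's bundled CA file, you can point git's http.sslCAInfo setting at the exported PEM file directly (the path below is just a placeholder for wherever you saved the exported certificate). Note that this file then becomes the only CA bundle git uses for HTTPS, so it should contain every certificate you need; omit --global inside the repository to set it for that one repository only:

git config --global http.sslCAInfo /path/to/exported-certificate.pem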

How to improve TPS of UDP Server

Once upon a time I was given a task to improve the TPS (transactions per second) of a UDP server. The UDP server was a very simple program written in Java using a DatagramSocket. The program received a UDP message and did some replacement in the message. Each message was small (around 1024 bytes).
But huge packet loss was being reported on the client side.

Analysis for finding the root cause was as follows :

  • Being a UDP server, a minor percentage of packet loss is acceptable, and if the network is heavily loaded it is unavoidable. But in this case around 80%-90% of packets were being dropped. The sysadmin also checked network congestion and it was within normal limits. So something else was definitely going wrong.
  • The server machine configuration was also quite good (64 GB RAM, 4-core CPU with a 2.x GHz clock speed, and a 10 Gbps network card), so there did not seem to be an issue with the machine's hardware configuration.
  • The other area of suspicion was multi-threading. The program was single-threaded: it read a message and processed it. The first thing that comes to every programmer is "let's try multi-threading". I also went ahead with that thought process and started looking at the code to make it multi-threaded.
    • But when I looked at the code, I realized that it was just doing a simple find-and-replace after reading the message, which should not take more than a few instructions on the processor. Around 80% packet loss due to processing time was very unlikely.
    • After looking at the code, it seemed that making it multi-threaded might actually increase the overall time instead of decreasing it, due to the synchronization, thread switching and other overhead introduced by multi-threading.
  • At this point I was sure that the packet loss was not happening due to network congestion, hardware configuration or long message processing time. So what was going wrong then? Why the packet loss?
  • So I started looking at all the steps involved in UDP packet processing. It has mainly 2 steps:
    • Packets are first received by the OS and buffered
    • The program reads packets from this buffer; in this case the Java program's DatagramSocket reads from the OS packet buffer
  • Since the program was a very simple one, I started exploring the first point to find an issue there
  • I found that on Linux systems, received UDP and TCP packets are queued.
    • There are separate buffers for reading packets and writing packets
    • Each buffer has some default memory allocated, but we can configure that memory as per our requirements.
  • Now this could be the problem: if there is not enough memory allocated to the buffer then packet loss can occur. Consider the following scenarios
    • Packets are coming in faster than the processing capability of the program. In that case packets will be buffered for some time, and when the buffer becomes full, packets start to drop
    • Packets are coming at a much higher rate, so high that even if processing can cope with the packet rate, there is not enough memory to store them. I will elaborate on this point with an example
      • Suppose packets are coming at a rate of 100 packets/second
      • The program is able to process 100 packets/second
      • But the OS has allocated memory to the read buffer such that only 10 packets can be stored
        • Now say in the 1st second 10 packets come; the program processes those packets and everything works fine
        • In the 2nd second 100 packets come simultaneously, but the OS buffer can store only 10 packets at a time, so it stores 10 packets and drops the remaining 90. The program processes those 10 packets
        • Here, even though the program is capable of processing 100 packets, packet loss occurs due to the low memory allocated to the OS read buffer. This situation is out of the program's control and needs to be handled at the OS level
  • Linux has configuration parameters for fine-tuning the UDP and TCP read/write buffers. The following parameters were changed:
    • net.core.rmem_default : the default size of the socket receive buffer in bytes
    • net.core.wmem_default : the default size (in bytes) of the socket send buffer
    • net.core.rmem_max : the maximum receive socket buffer size in bytes
    • net.core.wmem_max : the maximum send socket buffer size in bytes
    • net.core.netdev_max_backlog : the maximum number of packets queued on the INPUT side when the interface receives packets faster than the kernel can process them
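
Before tuning these parameters it is worth confirming that packets really are being dropped at the socket receive buffer. On Linux the UDP protocol statistics show this; look for counters such as "packet receive errors" / "receive buffer errors" (exact counter names vary between kernel versions):

  • netstat -su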

Following are the commands to change parameters on RHEL:

  • sysctl -w net.core.rmem_default=73400320
  • sysctl -w net.core.wmem_default=73400320
  • sysctl -w net.core.rmem_max=73400320
  • sysctl -w net.core.wmem_max=73400320
  • sysctl -w net.core.netdev_max_backlog=3000

The above values should also be specified in the /etc/sysctl.conf file so that they persist across machine restarts.

Follow the steps below to add these values to the /etc/sysctl.conf file

  • Open the /etc/sysctl.conf file
    • vi /etc/sysctl.conf
  • Now add the following properties to the file if they are not already present; if any property is already there, change its value
    • net.core.rmem_default = 73400320
    • net.core.wmem_default = 73400320
    • net.core.rmem_max = 73400320
    • net.core.wmem_max = 73400320
    • net.core.netdev_max_backlog = 3000
  • Save the /etc/sysctl.conf file
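
To apply the values from /etc/sysctl.conf immediately (without a restart), reload them with:

  • sysctl -p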

Here in my case I configured about 70 MB of buffer and a max_backlog of 3000, which turned out to be sufficient. Depending on the traffic requirement this may need to be adjusted.

Before these changes the UDP server's TPS was around 1000; after these changes the TPS went up to 9000, with headroom to spare if traffic increases.
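
On the application side there is a related knob worth knowing: a Java DatagramSocket can request a larger OS receive buffer via setReceiveBufferSize(), but Linux caps the effective size at net.core.rmem_max, which is why the kernel tuning above matters. A minimal sketch of such a receiver (the port number and buffer size are just illustrative values, not the original program):

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class SimpleUdpReceiver {

    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(9876);

        // Ask for a large receive buffer; the kernel silently caps it at
        // net.core.rmem_max, so print what was actually granted.
        socket.setReceiveBufferSize(70 * 1024 * 1024);
        System.out.println("Effective receive buffer: " + socket.getReceiveBufferSize() + " bytes");

        byte[] buf = new byte[1024];
        while (true) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet); // blocks until a datagram arrives
            // ... message processing (e.g. the find/replace) would go here ...
        }
    }
}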

 

Eclipse useful shortcuts

Some of the useful shortcuts which I use :

  • Ctrl + Shift + R – Open Resource
  • Ctrl + Shift + T – Open Type
  • Ctrl + Shift + S – Generate getters and setters
  • Ctrl + D – Remove line
  • Ctrl + L – Go to line number
  • Ctrl + / – Comment/uncomment code
  • Ctrl + Shift + / – Add/remove block comment
  • Ctrl + O – Go to method/variable declaration
  • Ctrl + Shift + O – Organize imports
  • Ctrl + Shift + F – Format the code
  • Alt + Left/Right arrow – Move between the last edit location and where you were
  • Alt + Up/Down arrow – Move the selection up/down
  • Ctrl + 1 – Quick fix; handy while implementing interface methods
  • Ctrl + E – List of all open editors
  • Ctrl + F6 (move between editors), Ctrl + F7 (move between views), Ctrl + F8 (move between perspectives)
  • Ctrl + F11 – Run the application
  • Ctrl + M – Maximize/minimize the current tab
  • Ctrl + N – Create new resource
  • Ctrl + I – Correct the indentation
  • Ctrl + J – Incremental search
  • Ctrl + Shift + L – Show currently defined shortcut keys
  • F12 – Activate editor
  • Ctrl + Shift + M – Add import
  • Alt + Shift + J – Add Javadoc comment
  • F3 – Go to declaration
  • Ctrl + Shift + W – Close all windows
  • Ctrl + R – Run to the current line (useful while debugging)

 

Linux – Comparing two files byte by byte


cmp:
This command can be used to compare two files byte by byte.
Some useful options of this command which I use are below:

-b : print differing bytes
-l : output byte numbers and values of all differing bytes

Example: 

cat test1.txt 
Hello World
cat test2.txt
hello world

cmp with no option set

cmp test1.txt test2.txt 
test1.txt test2.txt differ: byte 1, line 1

cmp with verbose output

cmp -bl test1.txt test2.txt 
1 110 H 150 h
7 127 W 167 w
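
cmp also reports the result through its exit status (0 when the files are identical, 1 when they differ), so combined with the -s (silent) option it is handy in shell scripts, for example:

cmp -s test1.txt test2.txt && echo "files are identical" || echo "files differ"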

 

 

Accessing spring managed beans from non spring managed classes

There may be a situation where some part of an application uses Spring for dependency injection and bean management while another part of the application lives outside the Spring-managed environment. If classes which are not managed by Spring require access to Spring-managed classes, then directly creating an instance of a Spring-managed class with the new operator will not work, because Spring will not be able to wire its dependencies correctly.

In such situations we need a bridge between Spring-managed beans and classes outside Spring's management scope. Following is the code for accessing Spring-managed beans from non-Spring-managed classes.

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

/**
 * Bridge Between Spring Managed beans and non spring managed classes
 * @author Pranav Maniar
 */
@Component
public class SpringBridge implements ApplicationContextAware {

    // Maintain reference to Spring's Application Context
    private static ApplicationContext context;

    @Override
    public void setApplicationContext(ApplicationContext context)
            throws BeansException {
        SpringBridge.context = context;
    }

    // Private constructor so the class is not instantiated directly with new
    // (Spring can still create the @Component instance via reflection)
    private SpringBridge() {
    }

    /**
     * Get Spring Managed bean from Non Spring Managed (outside of spring) classes
     * 
     * @param type, Class of Bean whose instance is required
     * @return Spring managed bean
     */
    public static <T> T getBean(Class<T> type) {
        return (T) context.getBean(type);
    }
}

For example, if there is a Spring-managed bean UserService which needs to be accessed from some other class, it can be obtained from outside of Spring (a non-managed class) in the following way:

 UserService userService = SpringBridge.getBean(UserService.class);

 

How does it work?

Spring Managed Component

First, SpringBridge needs to be registered as a Spring-managed component. In the above code this is done by putting the @Component annotation on the SpringBridge class. Also, the SpringBridge class needs to be in a package covered by Spring's component scan, so that Spring recognizes it as a managed component.

ApplicationContextAware Interface

By implementing this interface, the object is notified of the ApplicationContext it runs in. Normally the setApplicationContext method is invoked after population of normal bean properties but before an init callback. The SpringBridge class stores a reference to the ApplicationContext in a static variable; using this ApplicationContext any bean can be looked up.

Generic method for returning the bean

The static method public static <T> T getBean(Class<T> type) takes the bean's class as a parameter and returns the bean instance. It looks the bean up in the ApplicationContext and, if an instance is found there, returns it.

Build Hadoop 2.7.x from source code on Ubuntu

Following are the steps for building Hadoop 2.7.x from source code on Ubuntu.

Check out Hadoop code from Git

git clone git://git.apache.org/hadoop.git
git checkout branch-2.7.3

 

Install Dependencies

Install dependencies from apt-get

sudo apt-get update
sudo apt-get install openjdk-7-jdk maven git openssl dh-autoreconf cmake zlib1g-dev libssl-dev ssh rsync pkg-config

Install protocol buffer

wget https://github.com/google/protobuf/archive/v2.5.0.tar.gz
tar xvf v2.5.0.tar.gz
cd protobuf-2.5.0
./autogen.sh
./configure --prefix=/usr
make
make install

Install findbugs

wget https://sourceforge.net/projects/findbugs/files/findbugs/3.0.0/findbugs-3.0.0.zip/download
unzip download
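
The zip extracts into a findbugs-3.0.0 directory. Since FINDBUGS_HOME below points at /opt, move it there (or adjust the path in the next step to wherever you keep it):

sudo mv findbugs-3.0.0 /opt/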

 

Set Environment Variables

vim ~/.bashrc

## Put following lines at end of ~/.bashrc file
## Please change the path with appropriate value for your installation

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/
export FINDBUGS_HOME=/opt/findbugs-3.0.0/

 

Build Hadoop Distribution

mvn clean install -Pdist,native,docs,src -Dtar -DskipTests

The Hadoop tar.gz distribution will be created in the hadoop-dist/target/ directory
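
To sanity-check the build, you can extract the generated tarball and ask the bundled binary for its version (the exact tarball name depends on the version string of the branch you built; hadoop-2.7.3.tar.gz is assumed here):

tar xzf hadoop-dist/target/hadoop-2.7.3.tar.gz -C /tmp
/tmp/hadoop-2.7.3/bin/hadoop version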

Hibernate GenericDao

Need for GenericDao

When we use Hibernate in a typical J2EE project, there are some basic operations which must be supported for all Hibernate entities: create, update, delete, find and findAll. Generic code can be written which performs these basic operations for all entities. This approach has the following advantages

  • Generic code removes code duplication. It saves a lot of time otherwise spent writing and testing duplicate code.
  • It ensures a consistent interface. A consistent interface helps developers understand and implement code much faster

Sometimes the second point, a consistent interface, does not seem to be of much value, but I have seen projects where the interface was not consistent, and developers spent hours just to understand the code and implement things which were redundant.

 

How to write GenericDao

GenericDao needs to be written using generics, so that it can support any Hibernate entity. There is an interface GenericDao and its implementation GenericDaoImpl. All the DAO interfaces should extend the GenericDao interface and all the DAO implementations should extend the GenericDaoImpl implementation.

package in.pm.hibernate.genericdao_example.dao;

import java.io.Serializable;
import java.util.List;

public interface GenericDao<T, PK extends Serializable> {

    public PK create(T object);

    public T find(PK primaryKey);

    public List<T> findAll();

    public void update(T object);

    public void delete(T object);
}

Following is the implementation of GenericDao. Here Hibernate is configured directly with Spring and no JPA is used, but if we want to use JPA the approach remains the same. We get the actual entity class by looking at the type argument of the generic superclass.

package in.pm.hibernate.genericdao_example.dao.impl;

import in.pm.hibernate.genericdao_example.dao.GenericDao;

import java.io.Serializable;
import java.lang.reflect.ParameterizedType;
import java.util.List;

import org.hibernate.Query;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Transactional;

public class GenericDaoImpl<T, PK extends Serializable> implements
        GenericDao<T, PK> {

    private Class<T> actualType;

    @Autowired
    private SessionFactory sessionFactory;

    @SuppressWarnings("unchecked")
    public GenericDaoImpl() {

        // get superclass declaration
        ParameterizedType genericSuperClass = (ParameterizedType) getClass()
                .getGenericSuperclass();
        // Find out the actual type
        this.actualType = (Class<T>) genericSuperClass.getActualTypeArguments()[0];

    }

    public Session getSession() {
        return sessionFactory.getCurrentSession();
    }

    @SuppressWarnings("unchecked")
    @Transactional
    public PK create(T object) {
        PK id = (PK) getSession().save(object);
        return id;
    }

    @Transactional(readOnly = true)
    public T find(PK primaryKey) {
        T object = getSession().get(actualType, primaryKey);
        return object;
    }

    @SuppressWarnings("unchecked")
    public List&lt;T&gt; finalAll() {
        String findAllQueryStr = "from " + actualType.getName();
        Query findAllQuery = getSession().createQuery(findAllQueryStr);
        List&lt;T&gt; objects = (List&lt;T&gt;) findAllQuery.list();
        return objects;
    }

    @Transactional
    public void update(T object) {
        getSession().update(object);
    }

    @Transactional
    public void delete(T object) {
        getSession().delete(object);
    }

}

 

Usage of GenericDao

We will take two entities (Device, User) and see how GenericDao is useful in performing DB operations on them.

package in.pm.hibernate.genericdao_example.entity;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;

import org.hibernate.annotations.GenericGenerator;

@Entity
@Table(name = "Device")
public class Device {

    @Id
    @GeneratedValue(generator="increment")
    @GenericGenerator(name= "increment", strategy="increment")
    private Long id;
    
    private String deviceName;
    
    private String deviceType;

    
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getDeviceName() {
        return deviceName;
    }

    public void setDeviceName(String deviceName) {
        this.deviceName = deviceName;
    }

    public String getDeviceType() {
        return deviceType;
    }

    public void setDeviceType(String deviceType) {
        this.deviceType = deviceType;
    }
    
     
}

 

package in.pm.hibernate.genericdao_example.entity;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Table;

import org.hibernate.annotations.GenericGenerator;

@Entity
@Table(name = "USER")
public class User {

    @Id
    @GeneratedValue(generator = "increment")
    @GenericGenerator(name = "increment", strategy = "increment")
    private Long id;

    private String firstName;
    private String lastName;
    private String email;
    
    
    public Long getId() {
        return id;
    }
    public void setId(Long id) {
        this.id = id;
    }
    public String getFirstName() {
        return firstName;
    }
    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }
    public String getLastName() {
        return lastName;
    }
    public void setLastName(String lastName) {
        this.lastName = lastName;
    }
    public String getEmail() {
        return email;
    }
    public void setEmail(String email) {
        this.email = email;
    }
    
    
}

 

Dao for Entity

For each entity a DAO class needs to be created, which is responsible for the basic CRUD operations on the entity and supports additional queries if required.

Dao Interfaces

package in.pm.hibernate.genericdao_example.dao;

import in.pm.hibernate.genericdao_example.entity.Device;

public interface DeviceDao extends GenericDao<Device, Long> {

}
package in.pm.hibernate.genericdao_example.dao;

import in.pm.hibernate.genericdao_example.entity.User;

public interface UserDao extends GenericDao<User, Long> {
    
    public User getUserByName(String firstName);
}

 

Dao Implementation

package in.pm.hibernate.genericdao_example.dao.impl;

import org.springframework.stereotype.Repository;

import in.pm.hibernate.genericdao_example.dao.DeviceDao;
import in.pm.hibernate.genericdao_example.entity.Device;

@Repository
public class DeviceDaoImpl extends GenericDaoImpl<Device, Long> implements DeviceDao {

}
package in.pm.hibernate.genericdao_example.dao.impl;

import org.hibernate.Query;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

import in.pm.hibernate.genericdao_example.dao.UserDao;
import in.pm.hibernate.genericdao_example.entity.User;

@Repository
public class UserDaoImpl extends GenericDaoImpl<User, Long> implements UserDao {

    @Transactional(readOnly = true)
    public User getUserByName(String firstName) {
        String userByName = "from User u where u.firstName = :firstName";
        Query userByNameQuery = getSession().createQuery(userByName);
        userByNameQuery.setString("firstName", firstName);
        return (User) userByNameQuery.uniqueResult();
    }
    
}

 

As we can see, since we have used the generic DAO, the methods (create, update, delete, find and findAll) are automatically inherited by both DAOs (DeviceDao and UserDao). UserDao had an additional requirement of finding a user by name, so only that method is implemented in UserDao.

This ensures code reuse and a consistent interface. For example, if we want to store any entity, we just call the create method of that entity's DAO. The same applies to update, delete, find and findAll.

 

Test Application

Below is the test class which was used for testing. It creates two entities (User and Device) and inserts them into the database.

package in.pm.hibernate.genericdao_example.app;

import in.pm.hibernate.genericdao_example.dao.DeviceDao;
import in.pm.hibernate.genericdao_example.dao.UserDao;
import in.pm.hibernate.genericdao_example.entity.Device;
import in.pm.hibernate.genericdao_example.entity.User;

import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class App {
    
    @SuppressWarnings("resource")
    public static void main(String[] args) {
        
        ApplicationContext context = 
                new AnnotationConfigApplicationContext(AppConfiguration.class);
        
        UserDao userDao = context.getBean(UserDao.class);
        DeviceDao deviceDao = context.getBean(DeviceDao.class);
        
        User user = new User();
        user.setFirstName("Pranav");
        user.setLastName("Maniar");
        user.setEmail("pranav9428@gmail.com");
        
        Device device = new Device();
        device.setDeviceName("Moto G");
        device.setDeviceType("Mobile");
        
        userDao.create(user);
        deviceDao.create(device);
        
    }

}

 

Configure Hibernate 5 with Spring 4 using Java configuration

To configure Hibernate 5 with Spring 4 using Java configuration, a Spring configuration class needs to be created. A property file containing the database username/password, connection string, Hibernate settings, etc. will also be required.

Below is the property file which is used during the configuration.

## jdbc configuration 
driverclass = com.mysql.jdbc.Driver
jdbcurl = jdbc:mysql://localhost/db
username = test
password = test

## hibernate configuration
hibernate.dialect = org.hibernate.dialect.MySQLDialect
hibernate.show_sql = true
hibernate.hbm2ddl = create

 

Configuration Class

@Configuration
@ComponentScan(basePackages = { "in.pm.hibernate.genericdao_example" })
@EnableTransactionManagement
public class AppConfiguration {

}

First create a configuration class and annotate it with the following annotations

  • @Configuration : through this annotation Spring knows that this is a Java config class
  • @ComponentScan : specifies the list of base packages which contain Spring beans and Hibernate entities. When the Spring container starts it scans these packages and registers the beans by reading annotations
  • @EnableTransactionManagement : enables Spring's annotation-driven transaction management capability

 

 
    @Value("${driverclass}") 
    private String driverClass;
    
    @Value("${jdbcurl}")
    private String jdbcURL;
    
    @Value("${username}")
    private String userName;
    
    @Value("${password}")
    private String password;
    
    @Value("${hibernate.dialect}")
    private String hibernateDialect;
    
    @Value("${hibernate.show_sql}")
    private String hibernateShowSql;
    
    @Value("${hibernate.hbm2ddl}")
    private String hibernateHbm2ddlAuto;

Add the properties and annotate them with the appropriate @Value expressions, so that the values are read from the property file and bound to the fields

 

    @Bean
    public PropertyPlaceholderConfigurer getPropertyPlaceHolderConfigurer() {
        PropertyPlaceholderConfigurer ppc = new PropertyPlaceholderConfigurer();
        ppc.setLocation(new ClassPathResource("application.properties"));
        ppc.setIgnoreUnresolvablePlaceholders(true);
        return ppc;
    }

Create a PropertyPlaceholderConfigurer and provide it with the location of the property file. It will read the properties and populate the fields defined in the step above

 

    @Bean
    public DataSource getDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClass);
        dataSource.setUrl(jdbcURL);
        dataSource.setUsername(userName);
        dataSource.setPassword(password);
        return dataSource;
    }

    public Properties getHibernateProperties() {
        Properties properties = new Properties();
        properties.put("hibernate.dialect", hibernateDialect);
        properties.put("hibernate.show_sql", hibernateShowSql);
        properties.put("hibernate.hbm2ddl.auto", hibernateHbm2ddlAuto);
        return properties;
    }

    //Create a LocalSessionFactoryBean which will be used to create hibernate SessionFactory
    @Bean
    @Autowired
    public LocalSessionFactoryBean getSessionFactory(DataSource dataSource) {
        LocalSessionFactoryBean sfb = new LocalSessionFactoryBean();
        sfb.setDataSource(dataSource);
        sfb.setPackagesToScan("in.pm.hibernate.genericdao_example.entity");
        sfb.setHibernateProperties(getHibernateProperties());
        return sfb;
    }

Now, create the DataSource and inject it into the method which creates the SessionFactory.

Create a LocalSessionFactoryBean and set the DataSource and Hibernate properties on it. Also set the packages to be scanned for Hibernate entities. LocalSessionFactoryBean is a Spring FactoryBean which is used to create the Hibernate SessionFactory.

 

    @Bean
    @Autowired
    public HibernateTransactionManager transactionManager(SessionFactory sessionFactory) {
        HibernateTransactionManager tm = new HibernateTransactionManager();
        tm.setSessionFactory(sessionFactory);
        return tm;
    }

Finally, create a HibernateTransactionManager and assign it the SessionFactory created in the previous step.

NOTE: Please use the LocalSessionFactoryBean and HibernateTransactionManager from the "org.springframework.orm.hibernate5" package.

 

The complete configuration class looks as below:

package in.pm.hibernate.genericdao_example.app;

import java.util.Properties;

import javax.sql.DataSource;

import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.hibernate5.HibernateTransactionManager;
import org.springframework.orm.hibernate5.LocalSessionFactoryBean;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@ComponentScan(basePackages = { "in.pm.hibernate.genericdao_example" })
@EnableTransactionManagement
public class AppConfiguration {

    
    @Value("${driverclass}") 
    private String driverClass;
    
    @Value("${jdbcurl}")
    private String jdbcURL;
    
    @Value("${username}")
    private String userName;
    
    @Value("${password}")
    private String password;
    
    @Value("${hibernate.dialect}")
    private String hibernateDialect;
    
    @Value("${hibernate.show_sql}")
    private String hibernateShowSql;
    
    @Value("${hibernate.hbm2ddl}")
    private String hibernateHbm2ddlAuto;
    
    @Bean
    public PropertyPlaceholderConfigurer getPropertyPlaceHolderConfigurer() {
        PropertyPlaceholderConfigurer ppc = new PropertyPlaceholderConfigurer();
        ppc.setLocation(new ClassPathResource("application.properties"));
        ppc.setIgnoreUnresolvablePlaceholders(true);
        return ppc;
    }

    @Bean
    public DataSource getDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClass);
        dataSource.setUrl(jdbcURL);
        dataSource.setUsername(userName);
        dataSource.setPassword(password);
        return dataSource;
    }

    public Properties getHibernateProperties() {
        Properties properties = new Properties();
        properties.put("hibernate.dialect", hibernateDialect);
        properties.put("hibernate.show_sql", hibernateShowSql);
        properties.put("hibernate.hbm2ddl.auto", hibernateHbm2ddlAuto);
        return properties;
    }

    //Create a LocalSessionFactoryBean which will be used to create hibernate SessionFactory
    @Bean
    @Autowired
    public LocalSessionFactoryBean getSessionFactory(DataSource dataSource) {
        LocalSessionFactoryBean sfb = new LocalSessionFactoryBean();
        sfb.setDataSource(dataSource);
        sfb.setPackagesToScan("in.pm.hibernate.genericdao_example.entity");
        sfb.setHibernateProperties(getHibernateProperties());
        return sfb;
    }

    //NOTE: Use HibernateTransactionManager which is under hibernate5 package only.
    @Bean
    @Autowired
    public HibernateTransactionManager transactionManager(SessionFactory sessionFactory) {
        HibernateTransactionManager tm = new HibernateTransactionManager();
        tm.setSessionFactory(sessionFactory);
        return tm;
    }

}