Notes on integrating Spring Boot + MyBatis + Druid with multiple data sources

I recently worked on a project that needed Spring Boot + MyBatis + Druid with multiple data sources, under the constraint that the number and names of the data sources are not fixed: they are defined in application.properties, and the application switches between them dynamically at runtime.

This rules out the approach shown on the Druid website, which creates multiple DruidDataSource beans via the @ConfigurationProperties annotation. That approach is hard-coded: every new data source requires new code. My goal was a single piece of DataSource-building code, so that adding or removing a data source only means editing the configuration file.

Spring's AbstractRoutingDataSource supports switching the active DataSource at runtime, but the DataSourceBuilder it is usually paired with only builds a plain Spring JDBC DataSource, not the DruidDataSource we want, so the DruidDataSource objects have to be built by hand.

I used the following article as a reference:

http://412887952-qq-com.iteye.com/blog/2303075

It explains how to build multiple data sources with AbstractRoutingDataSource, but it has a few shortcomings:

1) The DataSource objects placed in targetDataSources only carry the basic properties driverClassName, username, password, and url. For a complex DataSource such as DruidDataSource, setting only these is not enough.

2) The AbstractRoutingDataSource is registered through an ImportBeanDefinitionRegistrar, which is not very intuitive.

My solution modifies that approach in a few places.

On my local MySQL instance I created three databases, testdb_1, testdb_2, and testdb_3, each containing a student table.

The table creation statement is as follows:

CREATE TABLE `student`  (
  `ID` int(11) NOT NULL AUTO_INCREMENT,
  `NAME` varchar(20) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `CLASS_NAME` varchar(30) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL,
  `CREATE_DATE` timestamp(0) NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP(0),
  `UPDATE_DATE` timestamp(0) NOT NULL ON UPDATE CURRENT_TIMESTAMP(0),
  PRIMARY KEY (`ID`) USING BTREE
) ENGINE = InnoDB AUTO_INCREMENT = 0 CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;

I also created three users, appuser_1, appuser_2, and appuser_3, one for each database (they need to be granted the appropriate privileges on their databases).

We create a Spring Boot project named SpringBootDruidMultiDB and import mybatis-spring-boot-starter, spring-boot-starter-web, and spring-boot-starter-test. For convenient Druid integration the project also imports druid-spring-boot-starter, plus spring-boot-starter-aop for the data-source-switching aspect. Since Log4j2 is used for logging, the Log4j2 libraries are added as well. The relevant part of the pom is:

<dependencies>
	<dependency>
		<groupId>com.alibaba</groupId>
		<artifactId>druid-spring-boot-starter</artifactId>
		<version>1.1.6</version>
	</dependency>
	<dependency>
		<groupId>org.mybatis.spring.boot</groupId>
		<artifactId>mybatis-spring-boot-starter</artifactId>
		<version>1.3.1</version>
	</dependency>

	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-aop</artifactId>
	</dependency>

	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-web</artifactId>
	</dependency>

	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-test</artifactId>
		<scope>test</scope>
	</dependency>

	<dependency>
		<groupId>org.apache.logging.log4j</groupId>
		<artifactId>log4j-api</artifactId>
		<version>2.10.0</version>
	</dependency>
	<dependency>
		<groupId>org.apache.logging.log4j</groupId>
		<artifactId>log4j-core</artifactId>
		<version>2.10.0</version>
	</dependency>
	<dependency>
		<groupId>com.lmax</groupId>
		<artifactId>disruptor</artifactId>
		<version>3.3.7</version>
	</dependency>
	<dependency>
		<groupId>mysql</groupId>
		<artifactId>mysql-connector-java</artifactId>
		<version>5.1.45</version>
	</dependency>
	<dependency>
		<groupId>com.alibaba</groupId>
		<artifactId>fastjson</artifactId>
		<version>1.2.43</version>
	</dependency>
	<dependency>
		<groupId>org.apache.commons</groupId>
		<artifactId>commons-lang3</artifactId>
		<version>3.7</version>
	</dependency>
	<dependency>
		<groupId>org.apache.commons</groupId>
		<artifactId>commons-collections4</artifactId>
		<version>4.1</version>
	</dependency>
	<dependency>
		<groupId>commons-logging</groupId>
		<artifactId>commons-logging</artifactId>
		<version>1.2</version>
	</dependency>
</dependencies>

Edit application.properties under src/main/resources and add the information for the multiple data sources:

spring.custom.datasource.name=db1,db2,db3

spring.custom.datasource.db1.name=db1
spring.custom.datasource.db1.type=com.alibaba.druid.pool.DruidDataSource
spring.custom.datasource.db1.driver-class-name=com.mysql.jdbc.Driver
spring.custom.datasource.db1.url=jdbc:mysql://localhost:3306/testdb_1?characterEncoding=utf8&autoReconnect=true&useSSL=false&useAffectedRows=true
spring.custom.datasource.db1.username=appuser1
spring.custom.datasource.db1.password=admin

spring.custom.datasource.db2.name=db2
spring.custom.datasource.db2.type=com.alibaba.druid.pool.DruidDataSource
spring.custom.datasource.db2.driver-class-name=com.mysql.jdbc.Driver
spring.custom.datasource.db2.url=jdbc:mysql://localhost:3306/testdb_2?characterEncoding=utf8&autoReconnect=true&useSSL=false&useAffectedRows=true
spring.custom.datasource.db2.username=appuser2
spring.custom.datasource.db2.password=admin

spring.custom.datasource.db3.name=db3
spring.custom.datasource.db3.type=com.alibaba.druid.pool.DruidDataSource
spring.custom.datasource.db3.driver-class-name=com.mysql.jdbc.Driver
spring.custom.datasource.db3.url=jdbc:mysql://localhost:3306/testdb_3?characterEncoding=utf8&autoReconnect=true&useSSL=false&useAffectedRows=true
spring.custom.datasource.db3.username=appuser3
spring.custom.datasource.db3.password=admin
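
With this scheme, adding a data source is purely a configuration change. For example, a hypothetical fourth database testdb_4 would only require extending the name list and adding one more block, with no Java changes (the db4 entries below are illustrative, not part of the project):

```properties
spring.custom.datasource.name=db1,db2,db3,db4

spring.custom.datasource.db4.name=db4
spring.custom.datasource.db4.type=com.alibaba.druid.pool.DruidDataSource
spring.custom.datasource.db4.driver-class-name=com.mysql.jdbc.Driver
spring.custom.datasource.db4.url=jdbc:mysql://localhost:3306/testdb_4?characterEncoding=utf8&autoReconnect=true&useSSL=false&useAffectedRows=true
spring.custom.datasource.db4.username=appuser4
spring.custom.datasource.db4.password=admin
```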

Then add the DruidDataSource properties:

spring.datasource.druid.initial-size=5
spring.datasource.druid.min-idle=5
spring.datasource.druid.async-init=true
spring.datasource.druid.async-close-connection-enable=true
spring.datasource.druid.max-active=20
spring.datasource.druid.max-wait=60000
spring.datasource.druid.time-between-eviction-runs-millis=60000
spring.datasource.druid.min-evictable-idle-time-millis=30000
spring.datasource.druid.validation-query=SELECT 1 FROM DUAL
spring.datasource.druid.test-while-idle=true
spring.datasource.druid.test-on-borrow=false
spring.datasource.druid.test-on-return=false
spring.datasource.druid.pool-prepared-statements=true
spring.datasource.druid.max-pool-prepared-statement-per-connection-size=20
spring.datasource.druid.filters=stat,wall,log4j2
spring.datasource.druid.connectionProperties=druid.stat.mergeSql=true;druid.stat.slowSqlMillis=5000

spring.datasource.druid.web-stat-filter.enabled=true
spring.datasource.druid.web-stat-filter.url-pattern=/*
spring.datasource.druid.web-stat-filter.exclusions=*.js,*.gif,*.jpg,*.png,*.css,*.ico,/druid/*
spring.datasource.druid.web-stat-filter.session-stat-enable=true
spring.datasource.druid.web-stat-filter.profile-enable=true


spring.datasource.druid.stat-view-servlet.enabled=true
spring.datasource.druid.stat-view-servlet.url-pattern=/druid/*
spring.datasource.druid.stat-view-servlet.login-username=admin
spring.datasource.druid.stat-view-servlet.login-password=admin
spring.datasource.druid.stat-view-servlet.reset-enable=false
spring.datasource.druid.stat-view-servlet.allow=127.0.0.1

spring.datasource.druid.filter.wall.enabled=true
spring.datasource.druid.filter.wall.db-type=mysql
spring.datasource.druid.filter.wall.config.alter-table-allow=false
spring.datasource.druid.filter.wall.config.truncate-allow=false
spring.datasource.druid.filter.wall.config.drop-table-allow=false

spring.datasource.druid.filter.wall.config.none-base-statement-allow=false
spring.datasource.druid.filter.wall.config.update-where-none-check=true
spring.datasource.druid.filter.wall.config.select-into-outfile-allow=false
spring.datasource.druid.filter.wall.config.metadata-allow=true
spring.datasource.druid.filter.wall.log-violation=true
spring.datasource.druid.filter.wall.throw-exception=true

spring.datasource.druid.filter.stat.log-slow-sql= true
spring.datasource.druid.filter.stat.slow-sql-millis=1000
spring.datasource.druid.filter.stat.merge-sql=true
spring.datasource.druid.filter.stat.db-type=mysql
spring.datasource.druid.filter.stat.enabled=true


spring.datasource.druid.filter.log4j2.enabled=true
spring.datasource.druid.filter.log4j2.connection-log-enabled=true
spring.datasource.druid.filter.log4j2.connection-close-after-log-enabled=true
spring.datasource.druid.filter.log4j2.connection-commit-after-log-enabled=true
spring.datasource.druid.filter.log4j2.connection-connect-after-log-enabled=true
spring.datasource.druid.filter.log4j2.connection-connect-before-log-enabled=true
spring.datasource.druid.filter.log4j2.connection-log-error-enabled=true
spring.datasource.druid.filter.log4j2.data-source-log-enabled=true
spring.datasource.druid.filter.log4j2.result-set-log-enabled=true
spring.datasource.druid.filter.log4j2.statement-log-enabled=true

Add a log4j2.xml file under src/main/resources:

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="OFF">
    <properties>
        <property name="logPath">./logs/</property>
    </properties>

    <Appenders>
        <Console name="Console" target="SYSTEM_OUT" ignoreExceptions="false">
            <PatternLayout pattern="%d [%t] %-5p %c - %m%n"/>
            <ThresholdFilter level="trace" onMatch="ACCEPT" onMismatch="DENY"/>
        </Console>
        <RollingFile name="infoLog" fileName="${logPath}/multidb_info.log"
                     filePattern="${logPath}/multidb_info-%d{yyyy-MM-dd}.log" append="true" immediateFlush="true">
            <PatternLayout pattern="%d [%t] %-5p %c - %m%n" />
            <TimeBasedTriggeringPolicy />
            <Policies>
                <SizeBasedTriggeringPolicy size="10 MB"/>
            </Policies>
            <DefaultRolloverStrategy max="30"/>
            <Filters>
                <ThresholdFilter level="error" onMatch="DENY" onMismatch="NEUTRAL"/>
                <ThresholdFilter level="trace" onMatch="ACCEPT" onMismatch="DENY"/>
            </Filters>
        </RollingFile>

        <RollingFile name="errorLog" fileName="${logPath}/multidb_error.log"
                     filePattern="${logPath}/multidb_error-%d{yyyy-MM-dd}.log" append="true" immediateFlush="true">
            <PatternLayout pattern="%d [%t] %-5p %c - %m%n" />
            <TimeBasedTriggeringPolicy />
            <Policies>
                <SizeBasedTriggeringPolicy size="10 MB"/>
            </Policies>
            <DefaultRolloverStrategy max="30"/>
            <Filters>
                <ThresholdFilter level="error" onMatch="ACCEPT" onMismatch="DENY"/>
            </Filters>
        </RollingFile>
    </Appenders>
    <Loggers>
        <AsyncLogger name="org.springframework.*" level="INFO"/>
        <AsyncLogger name="com.rick" level="INFO" additivity="false">
            <AppenderRef ref="infoLog" />
            <AppenderRef ref="errorLog" />
            <AppenderRef ref="Console" />
        </AsyncLogger>
        <Root level="INFO">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>

At first, following the single-data-source approach, I tried to build the DruidDataSource like this:

public DataSource createDataSource(Environment environment,
            String prefix)
{
     return DruidDataSourceBuilder.create().build(environment,prefix);
}

Here prefix is a prefix such as spring.custom.datasource.db1. However, at runtime none of the basic properties (driverClassName, url, username, password) of the resulting DruidDataSource were populated, and it threw an exception as soon as a connection was requested. Only a DruidDataSource built with the @ConfigurationProperties annotation was populated correctly (similar to the code below):

@Bean("db1")
@ConfigurationProperties(prefix="spring.custom.datasource.db1.")
public DataSource dataSource(Environment environment) {
    DruidDataSource ds = DruidDataSourceBuilder.create().build();
    return ds;
}

This way of building a DruidDataSource is concise, but it is hard-coded: the number of DruidDataSource beans is fixed, so data sources cannot be built dynamically and our requirement is not met. The article linked above builds the data sources as follows:

public DataSource buildDataSource(Map<String, Object> dsMap) {
    Object type = dsMap.get("type");

    if (type == null) {
        type = DATASOURCE_TYPE_DEFAULT; // default DataSource type
    }

    Class<? extends DataSource> dataSourceType;
    try {
        dataSourceType = (Class<? extends DataSource>) Class.forName((String) type);
        String driverClassName = dsMap.get("driverClassName").toString();
        String url = dsMap.get("url").toString();
        String username = dsMap.get("username").toString();
        String password = dsMap.get("password").toString();
        DataSourceBuilder factory = DataSourceBuilder.create()
                .driverClassName(driverClassName)
                .url(url)
                .username(username)
                .password(password)
                .type(dataSourceType);
        return factory.build();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    return null;
}

This builds a basic DataSource with the driverClassName, url, username, and password properties set, but properties such as initial-size and maxActive remain unset, so it does not meet our requirements either.

Borrowing from the article above and from the DruidDataSource-building code in druid-spring-boot-starter, we build a DynamicDataSource holding multiple DruidDataSource objects as follows.

First we create the DynamicDataSource and a ContextHolder that stores the identifier of the current DataSource:

public class DynamicDataSource extends AbstractRoutingDataSource {

    /**
     * Determine the current data source key
     */
    @Override
    protected Object determineCurrentLookupKey() {
        return DynamicDataSourceContextHolder.getDateSoureType();
    }

}
public class DynamicDataSourceContextHolder {

    /*
     * A ThreadLocal holds the current value: each thread gets its own
     * independent copy, so a thread can change its copy without
     * affecting the copies held by other threads.
     */
    private static final ThreadLocal<String> CONTEXT_HOLDER = new  ThreadLocal<String>();

    /*
     * Holds all registered data source ids, used to validate a
     * data source before switching to it.
     */
    public static List<String> datasourceId = new ArrayList<String>();

    /**
     * @Title: setDateSoureType
     * @Description: set the current data source key
     * @param dateSoureType
     * @return void
     * @throws
     */
    public static void setDateSoureType(String dateSoureType){
        CONTEXT_HOLDER.set(dateSoureType);
    }

    /**
     * @Title: getDateSoureType
     * @Description: get the current data source key
     * @return String
     * @throws
     */
    public static String getDateSoureType(){
        return CONTEXT_HOLDER.get();
    }

    /**
     * @Title: clearDateSoureType
     * @Description: clear the data source key for the current thread
     * @return void
     * @throws
     */
    public static void clearDateSoureType(){
        CONTEXT_HOLDER.remove();
    }

    /**
     * @Title: existDateSoure
     * @Description: check whether the data source id exists
     * @param dateSoureType
     * @return boolean
     * @throws
     */
    public static boolean existDateSoure(String dateSoureType ){
        return datasourceId.contains(dateSoureType);
    }
}
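
The per-thread isolation the comment above relies on can be checked in plain Java. Below is a minimal, self-contained sketch (not the Spring-managed class above) showing that a value set in the main thread is invisible to another thread:

```java
public class ContextHolderDemo {
    // Per-thread storage: each thread sees only the value it set itself
    private static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    public static void set(String dsId) { CONTEXT.set(dsId); }
    public static String get() { return CONTEXT.get(); }
    public static void clear() { CONTEXT.remove(); }

    public static void main(String[] args) throws InterruptedException {
        set("db1");
        final String[] seenByOther = new String[1];
        Thread other = new Thread(() -> seenByOther[0] = get());
        other.start();
        other.join();
        // The other thread never saw the value set by main
        System.out.println("other thread saw: " + seenByOther[0]); // null
        System.out.println("main thread sees: " + get());          // db1
        clear();
    }
}
```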

I defined a DataSourceAnnotation marker for methods that switch the DataSource, and an AOP aspect that performs the switch:

@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
@Documented
public @interface DataSourceAnnotation {
}
@Aspect
@Order(-1)
@Component
public class DynamicDataSourceAspect {

    private static final Logger logger = LogManager.getLogger(DynamicDataSourceAspect.class);

    /**
     * Switch the data source
     * @param point
     * @param dataSourceAnnotation
     * @return
     * @throws Throwable
     */
    @Before("@annotation(dataSourceAnnotation)")
    public void changeDataSource(JoinPoint point, DataSourceAnnotation dataSourceAnnotation){
        Object[] methodArgs = point.getArgs();
        String dsId = methodArgs[methodArgs.length-1].toString();

        if(!DynamicDataSourceContextHolder.existDateSoure(dsId)){
            logger.error("No data source found ...【"+dsId+"】");
            return;
        }else{
            DynamicDataSourceContextHolder.setDateSoureType(dsId);
        }
    }

    /**
     * @Title: destroyDataSource
     * @Description: clear the data source key after the advised method completes
     * @param point
     * @param dataSourceAnnotation
     * @return void
     * @throws
     */
    @After("@annotation(dataSourceAnnotation)")
    public void destroyDataSource(JoinPoint point,DataSourceAnnotation dataSourceAnnotation){
        DynamicDataSourceContextHolder.clearDateSoureType();
    }
}
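
The convention the aspect relies on is that the data source id travels as the last argument of the advised method. That extraction can be sketched in plain Java (a hypothetical helper for illustration, not part of the project code):

```java
public class LastArgDemo {
    // Mimics DynamicDataSourceAspect#changeDataSource: the data source id
    // is taken from the last argument of the advised method.
    static String extractDsId(Object[] methodArgs) {
        return methodArgs[methodArgs.length - 1].toString();
    }

    public static void main(String[] args) {
        // A call like studentService.insertStudent(student, "db2")
        Object[] callArgs = {new Object(), "db2"};
        System.out.println(extractDsId(callArgs)); // db2
    }
}
```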

In the linked article, switching is driven by the @TargetDataSource annotation: for a method carrying the annotation, the database is chosen from the annotation's value attribute (similar to the Service method below):

/**
 * Use the specified data source
 * @return
 */
@TargetDataSource("ds1")
public List<Demo> getListByDs1(){
    return testDao.getListByDs1();
}

This way of switching is very clumsy: with five data sources you need five getList methods, and once insert and update methods are added, the Service interface grows to dozens of methods whose bodies are identical except for the annotation value. The amount of redundant code is considerable.

我採起的方法是在須要切換數據源的方法參數中添加一個數據庫類型的參數dsId,在DynamicDataSourceAspect類的changeDataSource方法中讀取,根據dsId變量的值進行數據源切換,前提是這些Service方法上必須有DataSourceAnnotation註解,方法相似以下

@Service("studentService")
@Transactional
public class StudentServiceImpl implements StudentService {

    @Autowired
    private StudentMapper studentMapper;


    /**
     * Insert a student record
     *
     * @param student the student information
     * @param dsId the data source id
     * @return the id of the inserted student
     */
    @Override
    @DataSourceAnnotation
    public int insertEmployee(Student student, String dsId) {
        studentMapper.insertStudent(student);
        return student.getId();
    }

Note that this way of switching data sources is tailored to my own requirements; readers can adapt it to theirs.

Next we write the Druid configuration class, DruidConfig:

@Configuration
@EnableTransactionManagement
public class DruidConfig implements EnvironmentAware {

    private List<String> customDataSourceNames = new ArrayList<>();

    private Environment environment;

    /**
     * @param environment the environment to set
     */
    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

The Environment here is the environment object used to read the values in application.properties.

Building the DynamicDataSource:

@Bean(name = "dataSource")
    @Primary
    public AbstractRoutingDataSource dataSource() {
        DynamicDataSource dynamicDataSource = new DynamicDataSource();
        LinkedHashMap<Object, Object> targetDatasources = new LinkedHashMap<>();
        initCustomDataSources(targetDatasources);
        dynamicDataSource.setDefaultTargetDataSource(targetDatasources.get(customDataSourceNames.get(0)));
        dynamicDataSource.setTargetDataSources(targetDatasources);
        dynamicDataSource.afterPropertiesSet();
        return dynamicDataSource;
    }

A LinkedHashMap is used to store the target DataSource objects of the DynamicDataSource so that the data sources appear in construction order on the Druid monitoring pages. The initCustomDataSources method builds each DruidDataSource object; its code is as follows:

private void initCustomDataSources(LinkedHashMap<Object, Object> targetDataResources)
    {
        RelaxedPropertyResolver property =
                new RelaxedPropertyResolver(environment, DATA_SOURCE_PREFIX_CUSTOM);
        String dataSourceNames = property.getProperty(DATA_SOURCE_CUSTOM_NAME);
        if(StringUtils.isEmpty(dataSourceNames))
        {
            logger.error("The data source list is empty.");
        }
        else{
              RelaxedPropertyResolver springDataSourceProperty =
                      new RelaxedPropertyResolver(environment, "spring.datasource.");
              
              Map<String, Object> druidPropertiesMaps = springDataSourceProperty.getSubProperties("druid.");
              Map<String,Object> druidValuesMaps = new HashMap<>();
              for(String key:druidPropertiesMaps.keySet())
              {
                  String druidKey = AppConstants.DRUID_SOURCE_PREFIX + key;
                  druidValuesMaps.put(druidKey,druidPropertiesMaps.get(key));
              }

              MutablePropertyValues dataSourcePropertyValue = new MutablePropertyValues(druidValuesMaps);

              for (String dataSourceName : dataSourceNames.split(SEP)) {
                try {
                    Map<String, Object> dsMaps = property.getSubProperties(dataSourceName+".");

                    for(String dsKey : dsMaps.keySet())
                    {
                        if(dsKey.equals("type"))
                        {
                            dataSourcePropertyValue.addPropertyValue("spring.datasource.type", dsMaps.get(dsKey));
                        }
                        else
                        {
                            String druidKey = DRUID_SOURCE_PREFIX + dsKey;
                            dataSourcePropertyValue.addPropertyValue(druidKey, dsMaps.get(dsKey));
                        }
                    }

                    DataSource ds = dataSourcebuild(dataSourcePropertyValue);
                    if(null != ds){
                        if(ds instanceof DruidDataSource)
                        {
                            DruidDataSource druidDataSource = (DruidDataSource)ds;
                            druidDataSource.setName(dataSourceName);
                            initDruidFilters(druidDataSource);
                        }

                        customDataSourceNames.add(dataSourceName);
                        DynamicDataSourceContextHolder.datasourceId.add(dataSourceName);
                        targetDataResources.put(dataSourceName,ds);

                    }
                    logger.info("Data source 【"+dataSourceName+"】 initialized successfully ...");
                } catch (Exception e) {
                    logger.error("Data source 【"+dataSourceName+"】 initialization failed ...", e);
                }
            }
        }
    }

The method proceeds as follows:

1) Read spring.custom.datasource.name from application.properties (the data source list) to determine whether multiple data sources exist and what their names are. If the property is absent, no data source is built and the method returns immediately.

2) Read every property in application.properties that starts with spring.datasource.druid. — in this project, essentially the values from spring.datasource.druid.initial-size through spring.datasource.druid.connectionProperties — and put them into a MutablePropertyValues object, dataSourcePropertyValue. Inside dataSourcePropertyValue, each Druid property keeps the same key it has in application.properties.

3) Split the data source list from step 1 on commas to get the list of data source names, and loop over it. Each iteration reads the properties prefixed with spring.custom.datasource.[data source name]. The type property is mapped to spring.datasource.type; every other property is mapped to spring.datasource.druid. + [property name] and put into dataSourcePropertyValue. (Only DruidDataSource is handled here; the mapping can be adjusted to build other DataSource types.)
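
The key remapping in step 3 can be sketched as a small, self-contained helper. The DRUID_SOURCE_PREFIX value below is an assumption (spring.datasource.druid., matching the properties file), since the constant's definition is not shown in the article:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KeyMappingDemo {
    // Assumed value of the article's DRUID_SOURCE_PREFIX constant
    static final String DRUID_SOURCE_PREFIX = "spring.datasource.druid.";

    // Remap one data source's sub-properties to the keys the Druid binder expects
    static Map<String, Object> remap(Map<String, Object> dsMap) {
        Map<String, Object> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : dsMap.entrySet()) {
            if (e.getKey().equals("type")) {
                out.put("spring.datasource.type", e.getValue());
            } else {
                out.put(DRUID_SOURCE_PREFIX + e.getKey(), e.getValue());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> ds = new LinkedHashMap<>();
        ds.put("type", "com.alibaba.druid.pool.DruidDataSource");
        ds.put("url", "jdbc:mysql://localhost:3306/testdb_1");
        System.out.println(remap(ds));
    }
}
```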

4) Call the dataSourcebuild method to build the DataSource. The method is shown below:

/**
     * @Title: dataSourcebuild
     * @Description: create the data source
     * @param dataSourcePropertyValue the properties needed to create the data source
     *
     * @return DataSource the created data source object
     */
    public DataSource dataSourcebuild(MutablePropertyValues dataSourcePropertyValue)
    {
        DataSource ds = null;

        if(dataSourcePropertyValue.isEmpty()){
            return ds;
        }

        String type = dataSourcePropertyValue.get("spring.datasource.type").toString();

        if(StringUtils.isNotEmpty(type))
        {
            if(StringUtils.equals(type,DruidDataSource.class.getTypeName()))
            {
                ds = new DruidDataSource();

                RelaxedDataBinder dataBinder = new RelaxedDataBinder(ds, DRUID_SOURCE_PREFIX);
                dataBinder.setConversionService(conversionService);
                dataBinder.setIgnoreInvalidFields(false);
                dataBinder.setIgnoreNestedProperties(false);
                dataBinder.setIgnoreUnknownFields(true);
                dataBinder.bind(dataSourcePropertyValue);
            }
        }
        return ds;
    }

Based on the value of spring.datasource.type in dataSourcePropertyValue, dataSourcebuild creates the corresponding DataSource object (only DruidDataSource is handled here; additional branches can be added for other data source types). It then binds dataSourcePropertyValue to the new DruidDataSource under the spring.datasource.druid prefix: the RelaxedDataBinder automatically applies the Druid properties in dataSourcePropertyValue to the DruidDataSource object (via its applyPropertyValues method).
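
Relaxed binding is what lets a key like driver-class-name reach the setDriverClassName setter. The naming rule can be illustrated with a toy converter (this sketch illustrates the idea only; it is not Spring's actual RelaxedDataBinder implementation):

```java
public class RelaxedNameDemo {
    // Convert a relaxed property name such as "driver-class-name" into
    // the camelCase form a JavaBean setter expects ("driverClassName").
    static String toCamelCase(String relaxed) {
        StringBuilder sb = new StringBuilder();
        boolean upper = false;
        for (char c : relaxed.toCharArray()) {
            if (c == '-' || c == '_') {
                upper = true;     // next character starts a new word
                continue;
            }
            sb.append(upper ? Character.toUpperCase(c) : c);
            upper = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toCamelCase("driver-class-name")); // driverClassName
        System.out.println(toCamelCase("max_active"));        // maxActive
    }
}
```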

Moreover, DruidDataSource itself builds its list of Filter objects from the spring.datasource.druid.filters property; see the setFilters method of DruidAbstractDataSource:

public abstract class DruidAbstractDataSource extends WrapperAdapter implements DruidAbstractDataSourceMBean, DataSource, DataSourceProxy, Serializable {

...................
public void setFilters(String filters) throws SQLException {
        if (filters != null && filters.startsWith("!")) {
            filters = filters.substring(1);
            this.clearFilters();
        }
        this.addFilters(filters);
    }

public void addFilters(String filters) throws SQLException {
        if (filters == null || filters.length() == 0) {
            return;
        }

        String[] filterArray = filters.split("\\,");

        for (String item : filterArray) {
            FilterManager.loadFilter(this.filters, item.trim());
        }
    }
public class FilterManager {

.........
public static void loadFilter(List<Filter> filters, String filterName) throws SQLException {
        if (filterName.length() == 0) {
            return;
        }

        String filterClassNames = getFilter(filterName);

        if (filterClassNames != null) {
            for (String filterClassName : filterClassNames.split(",")) {
                if (existsFilter(filters, filterClassName)) {
                    continue;
                }

                Class<?> filterClass = Utils.loadClass(filterClassName);

                if (filterClass == null) {
                    LOG.error("load filter error, filter not found : " + filterClassName);
                    continue;
                }

                Filter filter;

                try {
                    filter = (Filter) filterClass.newInstance();
                } catch (ClassCastException e) {
                    LOG.error("load filter error.", e);
                    continue;
                } catch (InstantiationException e) {
                    throw new SQLException("load managed jdbc driver event listener error. " + filterName, e);
                } catch (IllegalAccessException e) {
                    throw new SQLException("load managed jdbc driver event listener error. " + filterName, e);
                }

                filters.add(filter);
            }

            return;
        }

        if (existsFilter(filters, filterName)) {
            return;
        }

        Class<?> filterClass = Utils.loadClass(filterName);
        if (filterClass == null) {
            LOG.error("load filter error, filter not found : " + filterName);
            return;
        }

        try {
            Filter filter = (Filter) filterClass.newInstance();
            filters.add(filter);
        } catch (Exception e) {
            throw new SQLException("load managed jdbc driver event listener error. " + filterName, e);
        }
    }

Once the DruidDataSource properties have been applied, its filters property holds the Filter objects corresponding to spring.datasource.druid.filters — in this project a StatFilter, a WallFilter, and a Log4j2Filter. These objects, however, are not yet bound to their properties in application.properties; we still have to bind them manually.

5) After dataSourcebuild returns the DruidDataSource, initCustomDataSources sets its name property to the data source name (to tell the data sources apart on the Druid monitoring pages) and calls initDruidFilters to populate the filters built in step 4. The initDruidFilters code is as follows:

private void initDruidFilters(DruidDataSource druidDataSource){

        List<Filter> filters = druidDataSource.getProxyFilters();

        RelaxedPropertyResolver filterProperty =
                new RelaxedPropertyResolver(environment, "spring.datasource.druid.filter.");


        String filterNames= environment.getProperty("spring.datasource.druid.filters");

        String[] filterNameArray = filterNames.split("\\,");

        for(int i=0; i<filterNameArray.length;i++){
            String filterName = filterNameArray[i];
            Filter filter = filters.get(i);

            Map<String, Object> filterValueMap = filterProperty.getSubProperties(filterName + ".");
            String statFilterEnabled = filterValueMap.get(ENABLED_ATTRIBUTE_NAME).toString();
            if(statFilterEnabled.equals("true")){
                MutablePropertyValues propertyValues = new  MutablePropertyValues(filterValueMap);
                RelaxedDataBinder dataBinder = new RelaxedDataBinder(filter);
                dataBinder.bind(propertyValues);
            }
        }
    }

initDruidFilters takes the list of filter names from spring.datasource.druid.filters and loops over it; for each filter it reads the corresponding property values, builds a MutablePropertyValues, and binds it to the matching Filter object, assigning its properties.

6) Finally the DruidDataSource is put into targetDataResources, and the data source name is added to the id list of DynamicDataSourceContextHolder, which is checked when switching data sources.

After initCustomDataSources finishes, the dataSource() method uses the first DruidDataSource as the default DataSource, completing the DynamicDataSource.

The StatViewServlet and web stat filter do not need to be built manually: the DruidWebStatFilterConfiguration and DruidStatViewServletConfiguration classes in druid-spring-boot-starter scan the corresponding property prefixes and build them automatically. It is enough to set spring.datasource.druid.web-stat-filter.enabled and spring.datasource.druid.stat-view-servlet.enabled to true in application.properties.

With the Druid configuration class in place, add the SessionFactory configuration class, SessionFactoryConfig:

@Configuration
@EnableTransactionManagement
@MapperScan("com.rick.mappers")
public class SessionFactoryConfig {

    private static Logger logger = LogManager.getLogger(SessionFactoryConfig.class);

    @Autowired
    private DataSource dataSource;

    private String typeAliasPackage = "com.rick.entities";

    /**
     * Create the SqlSessionFactoryBean instance:
     * set its configuration (e.g. camel-case mapping),
     * the mapper XML locations,
     * and the DataSource.
     * @return
     */
    @Bean(name = "sqlSessionFactory")
    public SqlSessionFactoryBean createSqlSessionFactoryBean() {
        logger.info("createSqlSessionFactoryBean method");

        try{
            ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
            SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
            sqlSessionFactoryBean.setDataSource(dataSource);            
            sqlSessionFactoryBean.setMapperLocations(resolver.getResources("classpath:com/rick/mappers/*.xml"));
            sqlSessionFactoryBean.setTypeAliasesPackage(typeAliasPackage);
            return sqlSessionFactoryBean;
        }
        catch(IOException ex){
            logger.error("Error happens when getting config files." + ExceptionUtils.getMessage(ex));
        }
        return null;
    }

    @Bean
    public SqlSessionTemplate sqlSessionTemplate(SqlSessionFactory sqlSessionFactory) {
        return new SqlSessionTemplate(sqlSessionFactory);
    }

    @Bean
    public PlatformTransactionManager annotationDrivenTransactionManager() {
        return new DataSourceTransactionManager(dataSource);
    }
}

SessionFactoryConfig is essentially the same as a single-data-source SessionFactory configuration; the only difference is that the DynamicDataSource replaces the single data source.

To test the integrated multi-data-source setup, I wrote a Service that inserts Student records and a REST API endpoint. The StudentService code is as follows:

public interface StudentService {

    /**
     * Insert a student record
     * @param student the student information
     * @param dsId the data source id
     * @return the id of the inserted student
     */
    int insertStudent(Student student, String dsId);



    /**
     * Find a student by id
     * @param id the student id
     * @param dsId the data source id
     * @return the student if found, otherwise null
     */
    Student findStudentById(int id, String dsId);

}

@Service("studentService")
@Transactional
public class StudentServiceImpl implements StudentService {

	@Autowired
    private StudentMapper studentMapper;

	/**
	 * Insert a student record
	 * @param student the student information
	 * @param dsId the data source id
	 * @return the id of the inserted student
	 */
	@Override
	@DataSourceAnnotation
	public int insertStudent(Student student, String dsId) {
		 studentMapper.insertStudent(student);
	     return student.getId();
	}

	/**
	 * Find a student by id
	 * @param id the student id
	 * @param dsId the data source id
	 * @return the student if found, otherwise null
	 */
	@Override
	@DataSourceAnnotation
	public Student findStudentById(int id, String dsId) {
		return studentMapper.findStudentById(id);
	}   
}

The REST API code is as follows:

@RestController
public class ApiController {

    @Autowired
    private StudentService studentService;


    /**
     * Add a student, then read it back to verify
     * @param params the student information in the request body
     * @return on success, "Success" and the inserted student; on failure, "Fail" and the error message
     */
    @RequestMapping(value = "/api/addStudent", produces = "application/json", method = RequestMethod.POST)
    public ResponseEntity<String> addStudent(@RequestBody String params)
    {
        ResponseEntity<String> response = null;
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON_UTF8);

        Map<String, Object> resultMap = new HashMap<>();
        try{

            JSONObject paramJsonObj = JSON.parseObject(params);

            Student student = new Student();
            student.setName(paramJsonObj.getString(NAME_ATTRIBUTE_NAME));
            student.setClassName(paramJsonObj.getString(CLASS_ATTRIBUTE_NAME));
            String dbType = paramJsonObj.getString(DB_TYPE);
            if(StringUtils.isEmpty(dbType))
            {
                dbType = "db1";
            }

            int id = studentService.insertStudent(student,dbType);

            if(id <= 0)
            {
                String errorMessage = "Error happens when insert student, the insert operation failed.";
                logger.error(errorMessage);
                resultMap = new HashMap<>();
                resultMap.put(RESULT_CONSTANTS, FAIL_RESULT);
                resultMap.put(MESSAGE_CONSTANTS, errorMessage);
                String resultStr = JSON.toJSONString(resultMap);

                return new ResponseEntity<>(resultStr,headers, HttpStatus.INTERNAL_SERVER_ERROR);
            }
            else
            {
                resultMap.put(RESULT_CONSTANTS, SUCCESS_RESULT);

                Student newStudent = studentService.findStudentById(id,dbType);

                Map<String, String> studentMap = new HashMap<>();
                studentMap.put(ID_ATTRIBUTE_NAME,Integer.toString(id));
                studentMap.put(NAME_ATTRIBUTE_NAME,newStudent.getName());
                studentMap.put(CLASS_ATTRIBUTE_NAME,newStudent.getClassName());
                studentMap.put(CRAETE_DATE_ATTRIBUTE_NAME, sdf.format(newStudent.getCreateDate()));
                studentMap.put(UPDATE_DATE_ATTRIBUTE_NAME, sdf.format(newStudent.getUpdateDate()));
                resultMap.put(STUDENT_NODE_NAME,studentMap);

                String resultStr = JSON.toJSONString(resultMap);
                response = new ResponseEntity<>(resultStr,headers, HttpStatus.OK);
            }
        }
        catch(Exception ex)
        {
            response = processException("insert student", ex, headers);
        }
        return response;
    }
}

The addStudent method decides which data source to switch to from the dbType value in the POST request body (in a real application the target data source could be derived from any part of the request data). The method first inserts a student record, then looks the record up again by the generated id.
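The exact JSON keys come from constants (NAME_ATTRIBUTE_NAME, CLASS_ATTRIBUTE_NAME, DB_TYPE) whose values are not shown in this post; assuming they are `name`, `className` and `dbType`, the first request body might look like:

```json
{
  "name": "Student1",
  "className": "Class1",
  "dbType": "db1"
}
```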

Next, add the corresponding mapper interface StudentMapper and the studentMapper.xml file:

@Mapper
public interface StudentMapper {

    Student findStudentById(int id);

    int insertStudent(Student student);    
}
<!DOCTYPE mapper
        PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
        "http://mybatis.org/dtd/mybatis-3-mapper.dtd">

<mapper namespace="com.rick.mappers.StudentMapper">
    <resultMap id="studentResultMap" type="com.rick.entities.Student">
        <id column="ID" property="id" />
        <result column="NAME" property="name" />
        <result column="CLASS_NAME" property="className" />
        <result column="CREATE_DATE" property="createDate" />
        <result column="UPDATE_DATE" property="updateDate" />
    </resultMap>
    <select id="findStudentById" resultMap="studentResultMap">
        select ID,NAME,CLASS_NAME,CREATE_DATE,UPDATE_DATE from student WHERE ID=#{id}
    </select>

    <insert id="insertStudent" parameterType="com.rick.entities.Student" useGeneratedKeys="true" keyProperty="id">
        insert into student(NAME,CLASS_NAME,CREATE_DATE,UPDATE_DATE) values(#{name},#{className},CURRENT_TIMESTAMP(),CURRENT_TIMESTAMP())
    </insert>

</mapper>

Modify the pom.xml file so that the mapper XML files are included in the build output:

<build>
    ......
    <resources>
        <resource>
            <directory>src/main/java</directory>
            <includes>
                <include>**/*.xml</include>
            </includes>
        </resource>
        ......
    </resources>
</build>

Start the Spring Boot project. The startup log shows that the db1, db2 and db3 DataSources are all built successfully.

Open the Druid monitoring page (http://localhost:8080/druid): the data source tab shows no data sources at all.

This is because the Druid console reads its data source information from the MBean server, and a data source is registered there only on first use, when the first JDBC connection is obtained. See the relevant Druid source code:

public class DruidDataSource extends DruidAbstractDataSource implements DruidDataSourceMBean, ManagedDataSource, Referenceable, Closeable, Cloneable, ConnectionPoolDataSource, MBeanRegistration {

 ..........

public DruidPooledConnection getConnection(long maxWaitMillis) throws SQLException {
        init();

        .......
    }

 public void init() throws SQLException {
   .........
   registerMbean();
   .........
}

public void registerMbean() {
        if (!mbeanRegistered) {
            AccessController.doPrivileged(new PrivilegedAction<Object>() {

                @Override
                public Object run() {
                    ObjectName objectName = DruidDataSourceStatManager.addDataSource(DruidDataSource.this,
                                                                                     DruidDataSource.this.name);

                    DruidDataSource.this.setObjectName(objectName);
                    DruidDataSource.this.mbeanRegistered = true;

                    return null;
                }
            });
        }
    }

Since nothing touches the database while the Spring Boot application starts, no connection is created and no data source is registered, so the monitoring page shows none.
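The `registerMbean()` guard above is a classic register-once-on-first-use pattern: construction registers nothing, and a flag ensures registration runs only on the first `getConnection()`. A stripped-down, Druid-free illustration of the same behaviour:

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java illustration of Druid's register-on-first-use behaviour:
// the "registry" stays empty until getConnection() is called once.
class LazyRegisteringDataSource {

    // Stands in for the MBean server the Druid console reads from.
    static final List<String> REGISTRY = new ArrayList<>();

    private final String name;
    private boolean registered = false; // same role as Druid's mbeanRegistered flag

    LazyRegisteringDataSource(String name) {
        this.name = name; // constructing the pool registers nothing yet
    }

    String getConnection() {
        if (!registered) {      // first use only
            REGISTRY.add(name); // now the monitor can see this data source
            registered = true;
        }
        return "connection-from-" + name;
    }
}
```

This is why the demo's data sources only appear on the monitoring page after the first insert request hits each of them.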

Now we run the student insert three times, each time against a different data source.

The second and third requests are similar to the first, with student names Student2 and Student3, class names Class2 and Class3, and dbType set to db2 and db3 respectively.

The responses of the second and third calls are similar to that of the first.

Checking the Druid monitoring page again, we can see that all three data sources are now registered.

The name shown on each data source tab is the name property we set on the corresponding DruidDataSource object.

Click the [View] button in the SQL list row of any data source to see the SQL statements issued by the API calls.

Open the [SQL firewall] page to see aggregated statistics across all three data sources.

All project code has been uploaded to GitHub:

https://github.com/yosaku01/SpringBootDruidMultiDB

If anything in this post is unclear, you can download the code and examine it.

Finally, this project is only a simple demo of Spring Boot + MyBatis + Druid multi-data-source integration; some of my understanding may be inaccurate and some approaches may be imperfect. Corrections and suggestions from readers are very welcome.
