Is it difficult to separate reading from writing? SpringBoot combined with AOP makes it simple to implement

King Wang

January 3, 2022

Contents

      • Preface
      • The deployment environment
      • Start the project
        • Directory structure
        • Building the table
        • Master and slave data source configuration
        • Setting up the routing
        • The data source annotation
        • Switching data sources with AOP
      • Caveats
      • References

Preface

It has been a month since I joined my new company. With the work at hand finished, I finally found time a few days ago to study the code of the company's older projects. While reading the code, I noticed that one project uses Spring AOP to implement read-write separation for the database. True to my love of learning (I don't quite believe that myself...), I decided to write an example project that achieves the same read-write separation effect with Spring AOP.

The deployment environment

Database: MySQL

Number of databases: 2, one master and one slave

Setting up a MySQL master-slave environment was covered in an earlier article, so I won't repeat it here; see "Step by step: how to build a MySQL master-slave replication environment on Windows".

Start the project

First, unsurprisingly, create a SpringBoot project, then add the following dependencies to the pom.xml file:

<dependencies>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>druid-spring-boot-starter</artifactId>
        <version>1.1.10</version>
    </dependency>
    <dependency>
        <groupId>org.mybatis.spring.boot</groupId>
        <artifactId>mybatis-spring-boot-starter</artifactId>
        <version>1.3.2</version>
    </dependency>
    <dependency>
        <groupId>tk.mybatis</groupId>
        <artifactId>mapper-spring-boot-starter</artifactId>
        <version>2.1.5</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>8.0.16</version>
    </dependency>
    <!-- Dependencies required for the dynamic data source ### start -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-jdbc</artifactId>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-aop</artifactId>
        <scope>provided</scope>
    </dependency>
    <!-- Dependencies required for the dynamic data source ### end -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.4</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
</dependencies>

Directory structure

After introducing the basic dependencies, sort out the directory structure. The skeleton of the finished project is shown below:
(screenshot of the project directory structure)
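
The original screenshot is not reproduced here. Based on the package names that appear in the code below (com.xjt.proxy.mapper in @MapperScan and com.xjt.proxy.dynamicdatasource in the pointcut), a plausible layout looks roughly like this; the remaining folder names are an assumption, not the exact original structure:

src/main/java/com/xjt/proxy
    dynamicdatasource/   DynamicDataSourceEnum, DataSourceContextHolder, DynamicDataSource,
                         DataSourceSelector, DataSourceContextAop
    config/              DataSourceConfig            (package name assumed)
    entity/              User                        (package name assumed)
    mapper/              UserMapper
    service/             UserService                 (package name assumed)
src/main/resources
    application.yml
    mapper/UserMapper.xml
src/test/java/com/xjt/proxy
    UserServiceTest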

Building the table

Create a user table. Execute the following SQL statement against the master library; replication will then create the same table and data in the slave library.

DROP TABLE IF EXISTS `user`;
CREATE TABLE `user` (
    `user_id` bigint(20) NOT NULL COMMENT 'user id',
    `user_name` varchar(255) DEFAULT '' COMMENT 'user name',
    `user_phone` varchar(50) DEFAULT '' COMMENT 'user phone number',
    `address` varchar(255) DEFAULT '' COMMENT 'address',
    `weight` int(3) NOT NULL DEFAULT '1' COMMENT 'weight, the higher the better',
    `created_at` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'creation time',
    `updated_at` datetime DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'update time',
    PRIMARY KEY (`user_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
INSERT INTO `user` VALUES ('1196978513958141952', 'test 1', '18826334748', 'Haizhu District, Guangzhou', '1', '2019-11-20 10:28:51', '2019-11-22 14:28:26');
INSERT INTO `user` VALUES ('1196978513958141953', 'test 2', '18826274230', 'Tianhe District, Guangzhou', '2', '2019-11-20 10:29:37', '2019-11-22 14:28:14');
INSERT INTO `user` VALUES ('1196978513958141954', 'test 3', '18826273900', 'Tianhe District, Guangzhou', '1', '2019-11-20 10:30:19', '2019-11-22 14:28:30');

Master and slave data source configuration

application.yml — the key part is the data source configuration for the master and slave databases:

server:
  port: 8001
spring:
  jackson:
    date-format: yyyy-MM-dd HH:mm:ss
    time-zone: GMT+8
  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    driver-class-name: com.mysql.cj.jdbc.Driver
    master:
      url: jdbc:mysql://127.0.0.1:3307/user?serverTimezone=Asia/Shanghai&useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&useSSL=false&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true
      username: root
      password:
    slave:
      url: jdbc:mysql://127.0.0.1:3308/user?serverTimezone=Asia/Shanghai&useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&useSSL=false&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true
      username: root
      password:

Since there are only two data sources, one master and one slave, we represent them with an enumeration class, which makes them easier to reference:

@Getter
public enum DynamicDataSourceEnum {
    MASTER("master"),
    SLAVE("slave");

    private String dataSourceName;

    DynamicDataSourceEnum(String dataSourceName) {
        this.dataSourceName = dataSourceName;
    }
}

The data source configuration class DataSourceConfig defines the two data sources, masterDb and slaveDb:

@Configuration
@MapperScan(basePackages = "com.xjt.proxy.mapper", sqlSessionTemplateRef = "sqlTemplate")
public class DataSourceConfig {

    // Master library
    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource masterDb() {
        return DruidDataSourceBuilder.create().build();
    }

    /**
     * Slave library
     */
    @Bean
    @ConditionalOnProperty(prefix = "spring.datasource", name = "slave", matchIfMissing = true)
    @ConfigurationProperties(prefix = "spring.datasource.slave")
    public DataSource slaveDb() {
        return DruidDataSourceBuilder.create().build();
    }

    /**
     * Master-slave dynamic routing data source
     */
    @Bean
    public DynamicDataSource dynamicDb(@Qualifier("masterDb") DataSource masterDataSource,
            @Autowired(required = false) @Qualifier("slaveDb") DataSource slaveDataSource) {
        DynamicDataSource dynamicDataSource = new DynamicDataSource();
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DynamicDataSourceEnum.MASTER.getDataSourceName(), masterDataSource);
        if (slaveDataSource != null) {
            targetDataSources.put(DynamicDataSourceEnum.SLAVE.getDataSourceName(), slaveDataSource);
        }
        dynamicDataSource.setTargetDataSources(targetDataSources);
        dynamicDataSource.setDefaultTargetDataSource(masterDataSource);
        return dynamicDataSource;
    }

    @Bean
    public SqlSessionFactory sessionFactory(@Qualifier("dynamicDb") DataSource dynamicDataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setMapperLocations(
                new PathMatchingResourcePatternResolver().getResources("classpath*:mapper/*Mapper.xml"));
        bean.setDataSource(dynamicDataSource);
        return bean.getObject();
    }

    @Bean
    public SqlSessionTemplate sqlTemplate(@Qualifier("sessionFactory") SqlSessionFactory sqlSessionFactory) {
        return new SqlSessionTemplate(sqlSessionFactory);
    }

    @Bean(name = "dataSourceTx")
    public DataSourceTransactionManager dataSourceTx(@Qualifier("dynamicDb") DataSource dynamicDataSource) {
        DataSourceTransactionManager dataSourceTransactionManager = new DataSourceTransactionManager();
        dataSourceTransactionManager.setDataSource(dynamicDataSource);
        return dataSourceTransactionManager;
    }
}

Setting up the routing

The purpose of the routing is to make it easy to locate the corresponding data source. We use a ThreadLocal to hold the data source key for each thread so it can be looked up later:

public class DataSourceContextHolder {

    private static final ThreadLocal<String> DYNAMIC_DATASOURCE_CONTEXT = new ThreadLocal<>();

    public static void set(String datasourceType) {
        DYNAMIC_DATASOURCE_CONTEXT.set(datasourceType);
    }

    public static String get() {
        return DYNAMIC_DATASOURCE_CONTEXT.get();
    }

    public static void clear() {
        DYNAMIC_DATASOURCE_CONTEXT.remove();
    }
}

Looking up the route:

public class DynamicDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return DataSourceContextHolder.get();
    }
}

AbstractRoutingDataSource routes to the corresponding data source based on a lookup key: it internally maintains the set of target data sources, the mapping from routing key to target data source, and the logic for picking a data source by key.
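
To make the routing concrete, here is a minimal sketch of switching the data source by hand, without the aspect defined later. The listUsersFromSlave method is hypothetical and not part of the project; only DataSourceContextHolder, DynamicDataSourceEnum and userMapper.selectAll() come from the code in this article.

// Hypothetical service method: route one query to the slave manually.
public List<User> listUsersFromSlave() {
    DataSourceContextHolder.set(DynamicDataSourceEnum.SLAVE.getDataSourceName());
    try {
        // While the ThreadLocal holds "slave", determineCurrentLookupKey()
        // resolves every connection request to the slave data source.
        return userMapper.selectAll();
    } finally {
        // Always reset the key, otherwise it leaks to the next request
        // handled by the same pooled thread.
        DataSourceContextHolder.clear();
    }
}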

The data source annotation

To make switching data sources convenient, we write an annotation whose value is the enum for the target data source, defaulting to the master library:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Documented
public @interface DataSourceSelector {

    DynamicDataSourceEnum value() default DynamicDataSourceEnum.MASTER;

    boolean clear() default true;
}

Switching data sources with AOP

At this point the AOP part can finally make its appearance. We define an aspect class that switches the data source for annotated methods; the code is as follows:

@Slf4j
@Aspect
@Order(value = 1)
@Component
public class DataSourceContextAop {

    @Around("@annotation(com.xjt.proxy.dynamicdatasource.DataSourceSelector)")
    public Object setDynamicDataSource(ProceedingJoinPoint pjp) throws Throwable {
        boolean clear = true;
        try {
            Method method = this.getMethod(pjp);
            DataSourceSelector dataSourceImport = method.getAnnotation(DataSourceSelector.class);
            clear = dataSourceImport.clear();
            DataSourceContextHolder.set(dataSourceImport.value().getDataSourceName());
            log.info("======== switch the data source to: {}", dataSourceImport.value().getDataSourceName());
            return pjp.proceed();
        } finally {
            if (clear) {
                DataSourceContextHolder.clear();
            }
        }
    }

    private Method getMethod(JoinPoint pjp) {
        MethodSignature signature = (MethodSignature) pjp.getSignature();
        return signature.getMethod();
    }
}

At this point, all the preparation and configuration work is done. Let's test the effect.

First, write the Service class, which contains both read and update methods:

@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;

    @DataSourceSelector(value = DynamicDataSourceEnum.SLAVE)
    public List<User> listUser() {
        List<User> users = userMapper.selectAll();
        return users;
    }

    @DataSourceSelector(value = DynamicDataSourceEnum.MASTER)
    public int update() {
        User user = new User();
        user.setUserId(Long.parseLong("1196978513958141952"));
        user.setUserName("Revised name 2");
        return userMapper.updateByPrimaryKeySelective(user);
    }

    @DataSourceSelector(value = DynamicDataSourceEnum.SLAVE)
    public User find() {
        User user = new User();
        user.setUserId(Long.parseLong("1196978513958141952"));
        return userMapper.selectByPrimaryKey(user);
    }
}

As the annotations on the methods show, the read methods go to the slave library and the update method goes to the master library; the record being updated is the one with userId 1196978513958141952.

Next, write a test class to check whether the expected effect is achieved:

@RunWith(SpringRunner.class)
@SpringBootTest
class UserServiceTest {

    @Autowired
    UserService userService;

    @Test
    void listUser() {
        List<User> users = userService.listUser();
        for (User user : users) {
            System.out.println(user.getUserId());
            System.out.println(user.getUserName());
            System.out.println(user.getUserPhone());
        }
    }

    @Test
    void update() {
        userService.update();
        User user = userService.find();
        System.out.println(user.getUserName());
    }
}

Test results:

1. Read method — (screenshot of the console output omitted)
2. Update method — (screenshot of the console output omitted)
After the test runs, comparing the databases shows that both the master and the slave library contain the modified data, which means our read-write separation works. Of course, the update method could also be pointed at the slave library; in that case only the slave data would be modified and the master would not be touched.

Caveats

The test example above is simple, but it matches a typical read-write separation setup. It is worth noting that the point of read-write separation is to relieve the pressure on the write library, i.e. the master, but it must be built on the principle of data consistency: the data in the master and slave libraries must stay consistent. If a method involves any write logic, then every database operation in that method should go to the master library.

Suppose a write has not yet been replicated to the slave library when a subsequent read executes; if that read still goes to the slave, it will see inconsistent data, which is exactly what we cannot allow.
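
As a minimal sketch of the rule above (hypothetical code, not part of the example project; the registerIfAbsent method is assumed, and insertSelective is the usual tk.mybatis mapper method), a method that mixes a read with a write is pinned to the master library so the read can never hit a lagging slave:

@DataSourceSelector(value = DynamicDataSourceEnum.MASTER)
public User registerIfAbsent(User candidate) {
    // The read also goes to the master, so it sees any write made just before it.
    User existing = userMapper.selectByPrimaryKey(candidate);
    if (existing != null) {
        return existing;
    }
    // The write goes to the master as well and is replicated to the slave afterwards.
    userMapper.insertSelective(candidate);
    return candidate;
}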

Finally, here is the GitHub address of the project. Interested readers can take a look, and remember to give it a star:

Address :https://github.com/Taoxj/mysql-proxy

References

https://www.cnblogs.com/cjsblog/p/9712457.html
