diff --git a/README.md b/README.md index fd16dc10..66853254 100644 --- a/README.md +++ b/README.md @@ -1,13 +1,11 @@ [![License](https://img.shields.io/badge/license-Apache%202-4EB1BA.svg)](https://www.apache.org/licenses/LICENSE-2.0.html) -English | [中文](/README_zh.md) +English | [中文](docs/zh_CN/ch1/README.md) ## Overview Qualitis is a data quality management platform that supports quality verification, notification, and management for various datasource. It is used to solve various data quality problems caused by data processing. -Based on Spring Boot, Qualitis submits quality model task to [Linkis](https://github.com/apache/incubator-linkis) platform. It provides functions such as data quality model construction, data quality model execution, data quality verification, reports of data quality generation and so on. - -At the same time, Qualitis provides enterprise-level features of financial-level resource isolation, management and access control. It is also guaranteed working well under high-concurrency, high-performance and high-availability scenarios. +Based on Spring Boot, Qualitis submits quality model tasks to the [Linkis](https://github.com/WeBankFinTech/Linkis) platform. It provides functions such as data quality model construction, data quality model execution, data quality verification, and data quality report generation. At the same time, Qualitis provides enterprise-level features such as financial-grade resource isolation, management, and access control, and is guaranteed to work well under high-concurrency, high-performance, and high-availability scenarios. ## Features - **Define Data Quality Model** @@ -34,14 +32,13 @@ Supports workflow Workflow needs [DataSphereStudio](https://github.com/WeBankFinTech/DataSphereStudio). - **Administrator Console** -Administrator console provided. -And it also supports personnel management, access control management, privilege control management, metadata management and so on. +An administrator console is provided, supporting personnel management, access control management, privilege control management, metadata management, and so on. ## Compared with similar systems ![](images/en_US/ch1/CompareSimilarSystem.png) ## Documents -[Quick Deploy](docs/en_US/ch1/QuickDeploy.md) +[Quick Deploy](docs/en_US/ch1/Quick%20Deploy%20Standalone.md) [User Manual](docs/en_US/ch1/User%20Manual.md) [Architecture Design](docs/en_US/ch1/Architecture%20Design.md)
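The overview above says Qualitis accepts quality model tasks from Linkis/DSS; in the AppConn changes later in this patch, every call (submit, status, result, kill, log) goes through a URL built by `HttpUtils.buildUrI` from an app ID, an app token, a random numeric nonce, and a millisecond timestamp. A self-contained sketch of that signed-URL pattern follows; note the SHA-256 digest over the four fields is an assumption for illustration only, not Qualitis's actual `HttpUtils` algorithm, and the host and token values are hypothetical.

```java
import java.net.URI;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Illustrative sketch of a signed Qualitis outer-API URL. The real logic lives in
// com.webank.wedatasphere.dss.appconn.qualitis.utils.HttpUtils; the SHA-256 digest
// over appId + nonce + timestamp + appToken below is an assumption.
public class OuterApiUrlSketch {

    static String sign(String appId, String appToken, String nonce, String timestamp)
            throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(
                (appId + nonce + timestamp + appToken).getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    static URI buildUri(String baseUrl, String path, String appId, String appToken,
                        String nonce, String timestamp) throws NoSuchAlgorithmException {
        // The AppConn passes RandomStringUtils.randomNumeric(5) as the nonce and
        // System.currentTimeMillis() as the timestamp.
        String signature = sign(appId, appToken, nonce, timestamp);
        return URI.create(baseUrl + "/" + path
                + "?app_id=" + appId
                + "&nonce=" + nonce
                + "&timestamp=" + timestamp
                + "&signature=" + signature);
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        URI uri = buildUri("http://qualitis-host:8090", "qualitis/outer/api/v1/execution",
                "linkis_id", "some-app-token", "12345", "1700000000000");
        System.out.println(uri);
    }
}
```

The same construction is reused for the status, result, kill, and (new in this patch) log endpoints, which differ only in the `path` argument.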
@@ -67,6 +64,11 @@ Supports generating data quality reports with optional latitude. ### 4. Support intelligent discovery of data quality problems
+**If you have any needs, please send us an issue and we will reply to you in time.** + +## Contributing +Community partners are very welcome to contribute new engines and codes to us! + ## Communication If you desire immediate response, please kindly raise issues to us or scan the below QR code by WeChat and QQ to join our group: ![](images/en_US/ch1/ContractUs.png) @@ -74,5 +76,3 @@ If you desire immediate response, please kindly raise issues to us or scan the b ## License **Qualitis is under the Apache 2.0 license. See the [LICENSE](/LICENSE) file for details.** -## Tips -The front-end code of Qualitis adopts the front-end framework FES self-developed by WeBank. The FES framework is currently open source, and the source code can be downloaded through the ui folder in the root directory. Of course, it can also be used directly by downloading the release. The default front-end compilation package is installed in Qualitis-x.x.x/conf/static. \ No newline at end of file diff --git a/README_zh.md b/README_zh.md deleted file mode 100644 index 0f58890a..00000000 --- a/README_zh.md +++ /dev/null @@ -1,80 +0,0 @@ -[![License](https://img.shields.io/badge/license-Apache%202-4EB1BA.svg)](https://www.apache.org/licenses/LICENSE-2.0.html) - -[English](/README.md) | 中文 - -## 引言 -Qualitis是一个支持多种异构数据源的质量校验、通知、管理服务的数据质量管理平台,用于解决业务系统运行、数据中心建设及数据治理过程中的各种数据质量问题。 - -Qualitis基于Spring Boot,依赖于Linkis进行数据计算,提供数据质量模型构建,数据质量模型执行,数据质量任务管理,异常数据发现保存以及数据质量报表生成等功能。并提供了金融级数据质量模型资源隔离,资源管控,权限隔离等企业特性,具备高并发,高性能,高可用的大数据质量管理能力。 - -## 核心特点 -- **数据质量模型定义** -支持以下数据模型定义: -1.单表校验数据模型。 -2.跨表校验数据模型。 -3.自定义校验数据模型。 -
同时,系统预置了多个数据质量校验模版,包括空值校验,枚举校验等常用校验,并且支持自定义数据质量模版。 - -- **数据质量模型调度** -支持数据质量模型调度。 - -- **数据质量报表** -支持生成数据质量报表。 - -- **日志管理** -日志聚合管理,方便排查数据质量任务 - -- **异常数据管理** -支持异常数据提取和存储,快速定位问题 - -- **支持工作流** -支持在工作流当中进行数据质量校验 -工作流必装[DataSphereStudio](https://github.com/WeBankFinTech/DataSphereStudio). - -- **管理员控制台** -提供管理员控制台界面,支持人员管理,权限管理,权限管理,元数据管理等管理功能。 - -## 与类似系统对比 -![](/images/zh_CN/ch1/相似系统对比图.png) - -## 文档列表 -[快速搭建手册](/docs/zh_CN/ch1/快速搭建手册——单机版.md) -[架构设计文档](/docs/zh_CN/ch1/架构设计文档.md) -[用户手册](/docs/zh_CN/ch1/用户手册.md) -[升级指南](/docs/zh_CN/ch1/升级指南.md) -
- -## Architecture -![](/images/zh_CN/ch1/总体架构设计.png) - -## Road Map -### 1. 支持对报表数据进行数据质量校验 - -- 支持在工作流当中,对生成的报表数据进行数据质量校验 - -### 2. 支持多种数据源的数据质量校验 - -- 支持HDFS, HIVE, MySQL等数据源间的数据质量校验 -- 支持实时数据的数据质量校验,如Kafka - -### 3. 支持生成可选纬度的数据质量报表 - -- 支持可选纬度生成数据质量报表 - -### 4. 支持智能发现数据质量问题 -
- -**如果您有任何需求,欢迎给我们提issue,我们将会及时给您回复。** - -## Contributing -非常欢迎广大的社区伙伴给我们贡献新引擎和代码! - -## Communication -如果您想得到最快的响应,请给我们提issue,或者您也可以扫码进群: -![](/images/en_US/ch1/ContractUs.png) - -## License -**Qualitis is under the Apache 2.0 license. See the [LICENSE](/LICENSE) file for details.** - -## Tips -Qualitis的前端代码是采用WeBank自研的前端框架FES,FES框架目前已开源,可以通过根目录的 ui 文件夹下进行源码下载,当然也可以通过下载 release 直接启动使用,默认的前端编译包安装在 Qualitis-x.x.x/conf/static 下。 diff --git a/appconn/pom.xml b/appconn/pom.xml index 96ae3f14..50ddd9e2 100644 --- a/appconn/pom.xml +++ b/appconn/pom.xml @@ -79,7 +79,7 @@ org.apache.linkis linkis-cs-common ${linkis.version} - compile + provided linkis-bml-client @@ -99,6 +99,7 @@ org.apache.linkis linkis-httpclient ${linkis.version} + provided linkis-common @@ -130,9 +131,13 @@ ${dss.version} provided + + com.fasterxml.jackson.core + jackson-databind + 2.14.1 + - diff --git a/appconn/src/main/icons/TableRules.icon b/appconn/src/main/icons/TableRules.icon new file mode 100644 index 00000000..cc3ea13b --- /dev/null +++ b/appconn/src/main/icons/TableRules.icon @@ -0,0 +1 @@ + diff --git a/appconn/src/main/icons/checkalert.icon b/appconn/src/main/icons/checkalert.icon new file mode 100644 index 00000000..cc3ea13b --- /dev/null +++ b/appconn/src/main/icons/checkalert.icon @@ -0,0 +1 @@ + diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/execution/QualitisRefExecutionOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/execution/QualitisRefExecutionOperation.java index 35b08ed5..34f49de2 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/execution/QualitisRefExecutionOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/execution/QualitisRefExecutionOperation.java @@ -25,7 +25,6 @@ import com.webank.wedatasphere.dss.standard.app.development.listener.core.LongTermRefExecutionOperation; import 
com.webank.wedatasphere.dss.standard.app.development.listener.core.Killable; import com.webank.wedatasphere.dss.standard.app.development.listener.core.Procedure; -import com.webank.wedatasphere.dss.standard.app.development.service.DevelopmentService; import com.webank.wedatasphere.dss.standard.app.development.listener.ref.ExecutionResponseRef; import com.webank.wedatasphere.dss.standard.app.development.listener.ref.RefExecutionRequestRef; import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; @@ -53,23 +52,25 @@ */ public class QualitisRefExecutionOperation extends LongTermRefExecutionOperation implements Killable, Procedure { - private static final String SUBMIT_TASK_PATH = "qualitis/outer/api/v1/execution"; - private static final String GET_TASK_STATUS_PATH = "qualitis/outer/api/v1/application/{applicationId}/status/"; - private static final String GET_TASK_RESULT_PATH = "qualitis/outer/api/v1/application/{applicationId}/result/"; - private static final String KILL_TASK_PATH = "qualitis/outer/api/v1/execution/application/kill/{applicationId}/{executionUser}"; + + private static Logger LOGGER = LoggerFactory.getLogger(QualitisRefExecutionOperation.class); + + private static final String FILTER = "filter"; private static final String NODE_NAME_KEY = "nodeName"; private static final String RULEGROUPID = "ruleGroupId"; private static final String RULE_GROUP_ID = "rule_group_id"; private static final String EXECUTION_USER_KEY = "executionUser"; private static final String WDS_SUBMIT_USER_KEY = "wds.dss.workflow.submit.user"; + private static final String SUBMIT_TASK_PATH = "qualitis/outer/api/v1/execution"; + private static final String GET_TASK_LOG_PATH = "qualitis/outer/api/v1/application/{applicationId}/log"; + private static final String GET_TASK_STATUS_PATH = "qualitis/outer/api/v1/application/{applicationId}/status/"; + private static final String GET_TASK_RESULT_PATH = 
"qualitis/outer/api/v1/application/{applicationId}/result/"; + private static final String KILL_TASK_PATH = "qualitis/outer/api/v1/execution/application/kill/{applicationId}/{executionUser}"; - private static Logger LOGGER = LoggerFactory.getLogger(QualitisRefExecutionOperation.class); private String appId = "linkis_id"; private String appToken = "***REMOVED***"; - private static final String FILTER = "filter"; - @Override protected String getAppConnName() { return QualitisAppConn.QUALITIS_APPCONN_NAME; @@ -254,7 +255,7 @@ public RefExecutionState state(RefExecutionAction action) { } LOGGER.info("Start to check job. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); Map response = restTemplate.getForEntity(url, Map.class).getBody(); - String finishLog = String.format("Succeed to submit job to qualitis. response: %s", response); + String finishLog = String.format("Succeed to check job. response: %s", response); LOGGER.info(finishLog); if (response == null) { @@ -265,7 +266,7 @@ public RefExecutionState state(RefExecutionAction action) { if (! checkResponse(response)) { String message = (String) response.get("message"); - String errorMsg = String.format("Error! Can not submit job, exception: %s", message); + String errorMsg = String.format("Error! Can not check job, exception: %s", message); LOGGER.error(errorMsg); return null; } @@ -362,7 +363,7 @@ public ExecutionResponseRef result(RefExecutionAction action) { } LOGGER.info("Start to get job result. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); Map response = restTemplate.getForEntity(url, Map.class).getBody(); - String finishLog = String.format("Succeed to submit job to qualitis. response: %s", response); + String finishLog = String.format("Succeed to get job result. response: %s", response); LOGGER.info(finishLog); if (response == null) { @@ -373,7 +374,7 @@ public ExecutionResponseRef result(RefExecutionAction action) { if (! 
checkResponse(response)) { String message = (String) response.get("message"); - String errorMsg = String.format("Error! Can not submit job, exception: %s", message); + String errorMsg = String.format("Error! Can not get job result, exception: %s", message); LOGGER.error(errorMsg); return null; } @@ -388,6 +389,13 @@ public ExecutionResponseRef result(RefExecutionAction action) { LOGGER.info(taskMsg); LOGGER.info(resultMessage); + try { + action.getExecutionRequestRefContext().appendLog(this.log(action)); + } catch (Exception e) { + LOGGER.error("Get qualitis log failed."); + LOGGER.error(e.getMessage(), e); + } + if (failedNum != 0) { return ExecutionResponseRef.newBuilder().error(); } else { @@ -403,6 +411,50 @@ public float progress(RefExecutionAction action) { @Override public String log(RefExecutionAction action) { - return ""; + if (null == action) { + return ""; + } + + QualitisRefExecutionAction qualitisRefExecutionAction = (QualitisRefExecutionAction) action; + String applicationId = qualitisRefExecutionAction.getApplicationId(); + String executionUser = qualitisRefExecutionAction.getExecutionUser(); + LOGGER.info("Qualitis application ID: {}", applicationId); + LOGGER.info("Qualitis execution user: {}", executionUser); + if (StringUtils.isEmpty(applicationId) || StringUtils.isEmpty(executionUser)) { + return ""; + } + HttpHeaders headers = new HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + + HttpEntity entity = new HttpEntity<>(headers); + + RestTemplate restTemplate = new RestTemplate(); + String url = null; + try { + url = HttpUtils.buildUrI(getBaseUrl(), GET_TASK_LOG_PATH.replace("{applicationId}", applicationId), appId, appToken, RandomStringUtils.randomNumeric(5), String.valueOf(System.currentTimeMillis())).toString(); + } catch (NoSuchAlgorithmException e) { + LOGGER.info("Qualitis no signature algor.", e); + } catch (URISyntaxException e) { + LOGGER.error("Qualitis uri syntax exception.", e); + } + LOGGER.info("Start to get 
job log. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + Map response = restTemplate.getForEntity(url, Map.class).getBody(); + String finishLog = String.format("Succeed to get job log."); + LOGGER.info(finishLog); + + if (response == null) { + String errorMsg = "Error! Can not get job log, response is null"; + LOGGER.error(errorMsg); + return ""; + } + + if (! checkResponse(response)) { + String message = (String) response.get("message"); + String errorMsg = String.format("Error! Can not get job log, exception: %s", message); + LOGGER.error(errorMsg); + return ""; + } + + return (String) response.get("data"); } } diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/project/QualitisCheckTemplateSearchOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/project/QualitisCheckTemplateSearchOperation.java index 4dbc006d..64cae9a2 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/project/QualitisCheckTemplateSearchOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/project/QualitisCheckTemplateSearchOperation.java @@ -5,26 +5,28 @@ import com.webank.wedatasphere.dss.appconn.qualitis.ref.entity.QualitisTemplate; import com.webank.wedatasphere.dss.appconn.qualitis.utils.HttpUtils; import com.webank.wedatasphere.dss.standard.app.structure.StructureRequestRef; -import com.webank.wedatasphere.dss.standard.app.structure.StructureRequestRefImpl; import com.webank.wedatasphere.dss.standard.app.structure.optional.AbstractOptionalOperation; -import com.webank.wedatasphere.dss.standard.app.structure.project.ref.ProjectResponseRef; -import com.webank.wedatasphere.dss.standard.common.entity.ref.RequestRefImpl; import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef; import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRefImpl; import 
com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import org.apache.commons.collections.CollectionUtils; +import java.net.URISyntaxException; +import java.security.NoSuchAlgorithmException; +import java.util.Arrays; +import java.util.HashMap; +import java.util.List; +import java.util.Map; import org.apache.commons.lang.RandomStringUtils; import org.apache.linkis.common.conf.CommonVars; import org.apache.linkis.server.BDPJettyServerHelper; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.http.*; +import org.springframework.http.HttpEntity; +import org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.http.HttpStatus; +import org.springframework.http.MediaType; import org.springframework.web.client.RestTemplate; -import java.net.URISyntaxException; -import java.security.NoSuchAlgorithmException; -import java.util.*; - /** * @author leebai * @date 2022/5/19 17:00 diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefExportOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefExportOperation.java index 1dacb540..0d2bbd13 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefExportOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefExportOperation.java @@ -27,14 +27,13 @@ import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; import java.net.URISyntaxException; import java.io.ByteArrayInputStream; -import java.nio.charset.StandardCharsets; import java.security.NoSuchAlgorithmException; import java.util.HashMap; import java.util.Map; import java.util.UUID; import javax.ws.rs.HttpMethod; -import org.apache.linkis.bml.client.BmlClient; import org.apache.commons.lang.RandomStringUtils; +import 
org.apache.linkis.bml.client.BmlClient; import org.apache.linkis.bml.client.BmlClientFactory; import org.apache.linkis.bml.protocol.BmlUploadResponse; import org.slf4j.Logger; @@ -110,8 +109,9 @@ public ExportResponseRef exportRef(RefJobContentRequestRefImpl requestRef) throw } LOGGER.info("Start to export to qualitis. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); Map response = restTemplate.getForEntity(url, Map.class).getBody(); - String finishLog = String.format("Finish to export to qualitis. response: %s", response); + String finishLog = String.format("Finish to export to qualitis. "); LOGGER.info(finishLog); + LOGGER.info("Finish to export to qualitis."); if (response == null) { String errorMsg = "Error! Can not export, response is null"; @@ -127,9 +127,9 @@ public ExportResponseRef exportRef(RefJobContentRequestRefImpl requestRef) throw } ObjectMapper objectMapper = new ObjectMapper(); Map data = (Map) response.get("data"); - String dataString; + byte[] dataString; try { - dataString = objectMapper.writeValueAsString(data); + dataString = objectMapper.writeValueAsBytes(data); } catch (JsonProcessingException e) { LOGGER.error("Error when parse export responses to json.", e); throw new ExternalOperationFailedException(90156, "Error when parse export responses to json.", e); @@ -139,7 +139,7 @@ public ExportResponseRef exportRef(RefJobContentRequestRefImpl requestRef) throw */ BmlClient bmlClient = BmlClientFactory.createBmlClient(DEFAULT_USER); BmlUploadResponse bmlUploadResponse = bmlClient.uploadResource(DEFAULT_USER, - "Qualitis_exported_" + UUID.randomUUID().toString(), new ByteArrayInputStream(dataString.getBytes(StandardCharsets.UTF_8))); + "Qualitis_exported_" + UUID.randomUUID().toString(), new ByteArrayInputStream(dataString)); Map resourceMap = new HashMap(); resourceMap.put("resourceId", bmlUploadResponse.resourceId()); resourceMap.put("version", bmlUploadResponse.version()); diff --git 
a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefImportOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefImportOperation.java index 1f39740d..7147b53b 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefImportOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/publish/QualitisRefImportOperation.java @@ -26,12 +26,11 @@ import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.ImportWitContextRequestRefImpl; import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; import java.io.IOException; -import java.nio.charset.Charset; import java.net.URISyntaxException; import java.nio.charset.StandardCharsets; import java.security.NoSuchAlgorithmException; -import java.util.HashMap; import java.util.Map; +import org.apache.commons.httpclient.HttpStatus; import org.apache.commons.io.IOUtils; import org.apache.linkis.bml.client.BmlClient; import org.apache.commons.lang.RandomStringUtils; @@ -39,10 +38,10 @@ import org.apache.linkis.bml.protocol.BmlDownloadResponse; import org.slf4j.Logger; import org.slf4j.LoggerFactory; +import org.springframework.http.MediaType; import org.springframework.http.HttpEntity; -import org.springframework.http.HttpHeaders; import org.springframework.http.HttpMethod; -import org.springframework.http.MediaType; +import org.springframework.http.HttpHeaders; import org.springframework.http.converter.StringHttpMessageConverter; import org.springframework.web.client.RestTemplate; @@ -68,26 +67,30 @@ public class QualitisRefImportOperation extends QualitisDevelopmentOperation resourceMap = requestRef.getResourceMap(); - LOGGER.info("Import request body" + new Gson().toJson(requestRef)); /** * BML client download opeartion. 
*/ BmlClient bmlClient = BmlClientFactory.createBmlClient(DEFAULT_USER); BmlDownloadResponse bmlDownloadResponse = bmlClient.downloadResource(DEFAULT_USER, resourceMap.get("resourceId").toString(), resourceMap.get("version").toString()); ObjectMapper objectMapper = new ObjectMapper(); - String dataJsonString = ""; + byte[] dataJsonString = new byte[128]; try { - dataJsonString = IOUtils.toString(bmlDownloadResponse.inputStream(), Charset.defaultCharset()); + dataJsonString = IOUtils.toByteArray(bmlDownloadResponse.inputStream()); } catch (IOException e) { LOGGER.error("Error with bml download and mapper to json.", e); } Map data = null; try { data = objectMapper.readValue(dataJsonString, Map.class); - LOGGER.info("BML downloaded data: ", data.toString()); } catch (JsonProcessingException e) { LOGGER.error("BML parse error."); + } catch (IOException e) { + LOGGER.error("Read json value error."); } HttpHeaders headers = new HttpHeaders(); headers.setContentType(MediaType.APPLICATION_JSON); @@ -95,11 +98,12 @@ public RefJobContentResponseRef importRef(ImportWitContextRequestRefImpl request data.put("newProjectId", requestRef.getRefProjectId()); data.put("userName", requestRef.getUserName()); + data.put("csId", requestRef.getContextId()); - HttpEntity entity = new HttpEntity<>(new Gson().toJson(data), headers); + HttpEntity entity = new HttpEntity<>(gson.toJson(data), headers); try { String url = HttpUtils.buildUrI(getBaseUrl(), IMPORT_RULE_URL, appId, appToken, RandomStringUtils.randomNumeric(5), String.valueOf(System.currentTimeMillis())).toString(); - LOGGER.info("Start to import rule. url: {}, method: {}, body: {}", url, HttpMethod.PUT, entity); + LOGGER.info("Start to import rule. 
url: {}, method: {}", url, HttpMethod.PUT); RestTemplate restTemplate = new RestTemplate(); restTemplate.getMessageConverters().set(1, new StringHttpMessageConverter(StandardCharsets.UTF_8)); Map response = restTemplate.exchange(url, HttpMethod.PUT, entity, Map.class).getBody(); @@ -108,15 +112,16 @@ public RefJobContentResponseRef importRef(ImportWitContextRequestRefImpl request throw new ExternalOperationFailedException(90157, "import qualitis appconn node exception."); } String code = response.get("code").toString(); - if (! "200".equals(code)) { - LOGGER.error("Failed to import rule. Response is not OK. Error message : {}",(String)response.get("message")); - throw new ExternalOperationFailedException(90157, "import qualitis appconn node exception : "+(String)response.get("message")); + if (!(HttpStatus.SC_OK + "").equals(code)) { + LOGGER.error("Failed to import rule. Response is not OK. Error message : {}", response.get("message")); + throw new ExternalOperationFailedException(90157, "import qualitis appconn node exception : " + response.get("message")); } LOGGER.info("Finished to import rule. 
response: {}", response); ruleGroupInfo = (Map) response.get("data"); if (ruleGroupInfo.containsKey("rule_group_id") && ruleGroupInfo.get("rule_group_id") != null) { Object ruleGroupId = ruleGroupInfo.get("rule_group_id"); ruleGroupInfo.remove("rule_group_id"); + ruleGroupInfo.put("ruleGroupId", ruleGroupId); } diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefCopyOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefCopyOperation.java index 49aafd8e..7cc8077b 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefCopyOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefCopyOperation.java @@ -17,26 +17,27 @@ package com.webank.wedatasphere.dss.appconn.qualitis.ref.operation; import com.google.gson.Gson; -import com.webank.wedatasphere.dss.appconn.qualitis.QualitisAppConn; -import com.webank.wedatasphere.dss.appconn.qualitis.publish.QualitisDevelopmentOperation; import com.webank.wedatasphere.dss.appconn.qualitis.utils.HttpUtils; +import com.webank.wedatasphere.dss.appconn.qualitis.QualitisAppConn; import com.webank.wedatasphere.dss.standard.app.development.operation.RefCopyOperation; -import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef; -import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.CopyWitContextRequestRefImpl; +import com.webank.wedatasphere.dss.standard.app.development.ref.RefJobContentResponseRef; +import com.webank.wedatasphere.dss.appconn.qualitis.publish.QualitisDevelopmentOperation; +import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.CopyWitContextAndDSSJobContentRequestRefImpl; import 
com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException; -import java.net.URISyntaxException; +import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant; +import org.apache.commons.lang.RandomStringUtils; import java.security.NoSuchAlgorithmException; +import java.net.URISyntaxException; import java.util.HashMap; import java.util.Map; -import org.apache.commons.lang.RandomStringUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; +import org.springframework.http.MediaType; import org.springframework.http.HttpEntity; -import org.springframework.http.HttpHeaders; import org.springframework.http.HttpMethod; import org.springframework.http.HttpStatus; -import org.springframework.http.MediaType; +import org.springframework.http.HttpHeaders; import org.springframework.web.client.RestClientException; import org.springframework.web.client.RestTemplate; @@ -44,8 +45,9 @@ * @author allenzhou@webank.com * @date 2021/6/21 14:40 */ -public class QualitisRefCopyOperation extends QualitisDevelopmentOperation - implements RefCopyOperation { +public class QualitisRefCopyOperation extends QualitisDevelopmentOperation + implements RefCopyOperation { + private static final Gson gson = new Gson(); private static final String COPY_RULE_URL = "/qualitis/outer/api/v1/projector/rule/copy"; private static final Logger LOGGER = LoggerFactory.getLogger(QualitisRefDeletionOperation.class); @@ -58,8 +60,7 @@ protected String getAppConnName() { } @Override - public RefJobContentResponseRef copyRef(CopyWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException { - Gson gson = new Gson(); + public RefJobContentResponseRef copyRef(CopyWitContextAndDSSJobContentRequestRefImpl requestRef) throws ExternalOperationFailedException { LOGGER.info("Qualitis copy request: " + gson.toJson(requestRef)); Map jobContent = requestRef.getRefJobContent(); String url; @@ -72,7 +73,7 @@ public RefJobContentResponseRef 
copyRef(CopyWitContextRequestRefImpl requestRef) LOGGER.error("Qualitis uri syntax exception.", e); throw new ExternalOperationFailedException(90156, "Construct copy outer url failed when copy."); } - if (! jobContent.containsKey("ruleGroupId") || jobContent.get("ruleGroupId") == null) { + if (jobContent == null || !jobContent.containsKey("ruleGroupId") || jobContent.get("ruleGroupId") == null) { throw new ExternalOperationFailedException(90156, "Rule group ID or username is null when copy."); } Integer ruleGroupId = null; @@ -87,10 +88,14 @@ public RefJobContentResponseRef copyRef(CopyWitContextRequestRefImpl requestRef) HttpHeaders headers = new HttpHeaders(); headers.setContentType(MediaType.APPLICATION_JSON); Map request = new HashMap<>(4); - request.put("create_user", requestRef.getUserName()); request.put("version", requestRef.getNewVersion()); + request.put("create_user", requestRef.getUserName()); request.put("source_rule_group_id", Long.valueOf(ruleGroupId.toString())); - request.put("target_project_id", requestRef.getParameter("projectId")); + // When copy cross projects, target project may be different with above. 
+ request.put("target_project_id", requestRef.getParameter("refProjectId")); + request.put("work_flow_name", requestRef.getDSSJobContent().get(DSSJobContentConstant.ORCHESTRATION_NAME)); + request.put("cs_id", requestRef.getContextId()); + request.put("node_name", requestRef.getName()); HttpEntity entity = new HttpEntity<>(gson.toJson(request), headers); diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefDeletionOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefDeletionOperation.java index 6492422f..54a15203 100644 --- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefDeletionOperation.java +++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefDeletionOperation.java @@ -61,7 +61,7 @@ protected String getAppConnName() { @Override public ResponseRef deleteRef(RefJobContentRequestRefImpl requestRef) throws ExternalOperationFailedException { // Get rule group info from request. 
-        LOGGER.info("Start to get the job content when delete ref.");
+        LOGGER.info("Start to get the job content when delete.");
         Map jobContent = requestRef.getRefJobContent();
         LOGGER.info("The job content when delete ref is:" + jobContent);
         String url;
@@ -89,6 +89,7 @@ public ResponseRef deleteRef(RefJobContentRequestRefImpl requestRef) throws Exte
         try {
             HttpHeaders headers = new HttpHeaders();
             headers.setContentType(MediaType.APPLICATION_JSON);
+
             Map request = new HashMap<>();
             request.put("rule_group_id", Long.valueOf(ruleGroupId.toString()));
diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefQueryOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefQueryOperation.java
index a2b37009..14828428 100644
--- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefQueryOperation.java
+++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefQueryOperation.java
@@ -16,16 +16,17 @@
 package com.webank.wedatasphere.dss.appconn.qualitis.ref.operation;

+import com.google.gson.Gson;
 import com.webank.wedatasphere.dss.common.label.EnvDSSLabel;
 import com.webank.wedatasphere.dss.standard.app.development.operation.RefQueryJumpUrlOperation;
 import com.webank.wedatasphere.dss.appconn.qualitis.publish.QualitisDevelopmentOperation;
 import com.webank.wedatasphere.dss.standard.app.development.ref.QueryJumpUrlResponseRef;
 import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef;
-import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.QueryJumpUrlRequestRefImpl;
+import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.QueryJumpUrlWithDSSJobContentRequestRefImpl;
+import com.webank.wedatasphere.dss.standard.app.development.utils.DSSJobContentConstant;
 import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException;
 import java.io.UnsupportedEncodingException;
 import java.net.URLEncoder;
-import java.util.Map;
 import java.net.URL;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -34,38 +35,70 @@
  * @author allenzhou@webank.com
  * @date 2021/6/21 14:40
  */
-public class QualitisRefQueryOperation extends QualitisDevelopmentOperation
-    implements RefQueryJumpUrlOperation {
+public class QualitisRefQueryOperation extends QualitisDevelopmentOperation
+    implements RefQueryJumpUrlOperation {

     private static Logger LOGGER = LoggerFactory.getLogger(QualitisRefQueryOperation.class);

     private static final String BASH = "bash";
+    private static final String CHECK_ALERT = "checkalert";
+    private static final String TABLE_RULES = "TableRules";

-    public String getEnvUrl(String url, String host, int port, QueryJumpUrlRequestRefImpl qualitisOpenRequestRef) throws UnsupportedEncodingException {
-        String env = qualitisOpenRequestRef.getDSSLabels().stream().filter(dssLabel -> dssLabel instanceof EnvDSSLabel)
-            .map(dssLabel -> (EnvDSSLabel) dssLabel).findAny().get().getEnv();
-        Long projectId = qualitisOpenRequestRef.getRefProjectId();
-        String redirectUrl = "http://" + host + ":" + port + "/#/addGroupTechniqueRule?tableType=1&id=" + projectId + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}";
-        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + env.toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
+    private static final Gson gson = new Gson();
+
+    public String getEnvUrl(String url, String host, int port, QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef, String workflowName, String workflowVersion) throws UnsupportedEncodingException {
+        String redirectUrl = "http://" + host + ":" + port + "/#/projects/rules?workflowProject=true&tpl=newSingleTableRule&projectId=" + qualitisOpenRequestRef.getRefProjectId() + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}&dssurl=${dssurl}&workflowName=" + workflowName + "&workflowVersion=" + workflowVersion;
+
+        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + getEnv(qualitisOpenRequestRef).toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
+    }
+
+    public String getEnvUrlForBash(String url, String host, int port, QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef, String workflowName, String workflowVersion) throws UnsupportedEncodingException {
+        String redirectUrl = "http://" + host + ":" + port + "/#/scripts?workflowProject=true&projectId=" + qualitisOpenRequestRef.getRefProjectId() + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}&dssurl=${dssurl}&workflowName=" + workflowName + "&workflowVersion=" + workflowVersion;
+
+        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + getEnv(qualitisOpenRequestRef).toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
+    }
+
+    public String getEnvUrlForCheckAlert(String url, String host, int port, QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef, String workflowName, String workflowVersion) throws UnsupportedEncodingException {
+        String redirectUrl = "http://" + host + ":" + port + "/#/checkAlert?workflowProject=true&projectId=" + qualitisOpenRequestRef.getRefProjectId() + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}&dssurl=${dssurl}&workflowName=" + workflowName + "&workflowVersion=" + workflowVersion;
+
+        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + getEnv(qualitisOpenRequestRef).toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
+    }
+
+    public String getEnvUrlForTableCheckRules(String url, String host, int port, QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef, String workflowName, String workflowVersion) throws UnsupportedEncodingException {
+        String redirectUrl = "http://" + host + ":" + port + "/#/projects/tableGroupRules?workflowProject=true&projectId=" + qualitisOpenRequestRef.getRefProjectId() + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}&dssurl=${dssurl}&workflowName=" + workflowName + "&workflowVersion=" + workflowVersion;
+
+        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + getEnv(qualitisOpenRequestRef).toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
     }

-    public String getEnvUrlForBash(String url, String host, int port, QueryJumpUrlRequestRefImpl qualitisOpenRequestRef) throws UnsupportedEncodingException {
-        String env = qualitisOpenRequestRef.getDSSLabels().stream().filter(dssLabel -> dssLabel instanceof EnvDSSLabel)
-            .map(dssLabel -> (EnvDSSLabel) dssLabel).findAny().get().getEnv();
-        Long projectId = qualitisOpenRequestRef.getRefProjectId();
-        String redirectUrl = "http://" + host + ":" + port + "/#/scripts?workflowProject=true&projectId=" + projectId + "&ruleGroupId=${ruleGroupId}&nodeId=${nodeId}&contextID=${contextID}&nodeName=${nodeName}";
-        return url + "?redirect=" + URLEncoder.encode(redirectUrl + "&env=" + env.toLowerCase(), "UTF-8") + "&dssurl=${dssurl}&cookies=${cookies}";
+    public String getEnv(QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef) {
+        return qualitisOpenRequestRef.getDSSLabels().stream().filter(dssLabel -> dssLabel instanceof EnvDSSLabel).map(dssLabel -> (EnvDSSLabel) dssLabel).findAny().get().getEnv();
     }

     @Override
-    public QueryJumpUrlResponseRef query(QueryJumpUrlRequestRefImpl qualitisOpenRequestRef) throws ExternalOperationFailedException {
+    public QueryJumpUrlResponseRef query(QueryJumpUrlWithDSSJobContentRequestRefImpl qualitisOpenRequestRef) throws ExternalOperationFailedException {
         try {
             String baseUrl = getAppInstance().getBaseUrl() + "qualitis/api/v1/redirect";
-            LOGGER.info("Get base url from app instancd and retur redirect url: " + baseUrl);
+            LOGGER.info("Qualitis query request: " + gson.toJson(qualitisOpenRequestRef));
+            LOGGER.info("Get base url from app instance and return redirect url: " + baseUrl);
             URL url = new URL(baseUrl);
+
             String host = url.getHost();
             int port = url.getPort();
-            String retJumpUrl = getEnvUrl(baseUrl, host, port, qualitisOpenRequestRef);
+            if (port < 0) {
+                port = 80;
+            }
+
+            String workflowName = (String) qualitisOpenRequestRef.getDSSJobContent().get(DSSJobContentConstant.ORCHESTRATION_NAME);
+            String workflowVersion = (String) qualitisOpenRequestRef.getDSSJobContent().get(DSSJobContentConstant.ORC_VERSION_KEY);
+
+            String retJumpUrl = getEnvUrl(baseUrl, host, port, qualitisOpenRequestRef, workflowName, workflowVersion);
+            if (qualitisOpenRequestRef.getType().contains(BASH)) {
+                retJumpUrl = getEnvUrlForBash(baseUrl, host, port, qualitisOpenRequestRef, workflowName, workflowVersion);
+            } else if (qualitisOpenRequestRef.getType().contains(CHECK_ALERT)) {
+                retJumpUrl = getEnvUrlForCheckAlert(baseUrl, host, port, qualitisOpenRequestRef, workflowName, workflowVersion);
+            } else if (qualitisOpenRequestRef.getType().contains(TABLE_RULES)) {
+                retJumpUrl = getEnvUrlForTableCheckRules(baseUrl, host, port, qualitisOpenRequestRef, workflowName, workflowVersion);
+            }
             return QueryJumpUrlResponseRef.newBuilder().setJumpUrl(retJumpUrl).success();
         } catch (Exception e) {
             throw new ExternalOperationFailedException(90177, "Failed to parse jobContent ", e);
diff --git a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefUpdateOperation.java b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefUpdateOperation.java
index 63554ec7..fda92051 100644
--- a/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefUpdateOperation.java
+++ b/appconn/src/main/java/com/webank/wedatasphere/dss/appconn/qualitis/ref/operation/QualitisRefUpdateOperation.java
@@ -18,23 +18,26 @@
 import com.google.gson.Gson;
 import com.webank.wedatasphere.dss.appconn.qualitis.QualitisAppConn;
-import com.webank.wedatasphere.dss.appconn.qualitis.utils.HttpUtils;
-import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef;
 import com.webank.wedatasphere.dss.appconn.qualitis.publish.QualitisDevelopmentOperation;
+import com.webank.wedatasphere.dss.appconn.qualitis.utils.HttpUtils;
 import com.webank.wedatasphere.dss.standard.app.development.operation.RefUpdateOperation;
+import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef;
+import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.UpdateWitContextRequestRefImpl;
 import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRef;
-import com.webank.wedatasphere.dss.standard.common.entity.ref.ResponseRefImpl;
 import com.webank.wedatasphere.dss.standard.common.exception.operation.ExternalOperationFailedException;
-import com.webank.wedatasphere.dss.standard.app.development.ref.impl.ThirdlyRequestRef.UpdateWitContextRequestRefImpl;
+import java.net.URISyntaxException;
+import java.security.NoSuchAlgorithmException;
 import java.util.HashMap;
 import java.util.Map;
+import org.apache.commons.lang.RandomStringUtils;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.http.HttpEntity;
 import org.springframework.http.HttpHeaders;
+import org.springframework.http.HttpMethod;
 import org.springframework.http.HttpStatus;
 import org.springframework.http.MediaType;
-import org.apache.commons.lang.RandomStringUtils;
+import org.springframework.web.client.RestClientException;
 import org.springframework.web.client.RestTemplate;

 /**
@@ -44,57 +47,75 @@ public class QualitisRefUpdateOperation extends QualitisDevelopmentOperation implements RefUpdateOperation {

-    private static final String UPDATE_RULE_PATH = "/qualitis/outer/api/v1/projector/rule/modify";
-
-    private String appId = "linkis_id";
-    private String appToken = "***REMOVED***";
-
+    private static final String MODIFY_RULE_URL = "/qualitis/outer/api/v1/projector/rule/modify";
     private final static Logger LOGGER = LoggerFactory.getLogger(QualitisRefUpdateOperation.class);

+    private static String appId = "linkis_id";
+    private static String appToken = "***REMOVED***";
+
     @Override
     protected String getAppConnName() {
         return QualitisAppConn.QUALITIS_APPCONN_NAME;
     }

-    private ResponseRef updateQualitisCS(UpdateWitContextRequestRefImpl qualitisUpdateCSRequestRef) throws ExternalOperationFailedException {
-        LOGGER.info("Update CS request body" + new Gson().toJson(qualitisUpdateCSRequestRef));
-        String csId = qualitisUpdateCSRequestRef.getContextId();
-        String projectName = qualitisUpdateCSRequestRef.getProjectName();
-
-        LOGGER.info("Start set context for qualitis node: {}", qualitisUpdateCSRequestRef.getName());
-        HttpHeaders headers = new HttpHeaders();
-        headers.setContentType(MediaType.APPLICATION_JSON);
-        Map requestPayLoad = new HashMap<>();
+    @Override
+    public ResponseRef updateRef(UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException {
+        // Get rule group info from request.
+        LOGGER.info("Start to get the job content when modify.");
+        Map jobContent = requestRef.getRefJobContent();
+        LOGGER.info("The job content when modify ref is:" + jobContent);
+        String url;
         try {
-            requestPayLoad.put("cs_id", csId);
-            requestPayLoad.put("rule_name", qualitisUpdateCSRequestRef.getName());
-            requestPayLoad.put("project_name", projectName);
-            requestPayLoad.put("project_id", qualitisUpdateCSRequestRef.getRefProjectId());
+            url = HttpUtils.buildUrI(getBaseUrl(), MODIFY_RULE_URL, appId, appToken, RandomStringUtils.randomNumeric(5), String.valueOf(System.currentTimeMillis())).toString();
+        } catch (NoSuchAlgorithmException e) {
+            LOGGER.error("Modify rule failed. Rule group info: {}, exception: {}", jobContent, e);
+            throw new ExternalOperationFailedException(90156, "Construct modify outer url failed when modify.");
+        } catch (URISyntaxException e) {
+            LOGGER.error("Qualitis uri syntax exception.", e);
+            throw new ExternalOperationFailedException(90156, "Construct modify outer url failed when modify.");
+        }
+        if (jobContent == null || !jobContent.containsKey("ruleGroupId") || jobContent.get("ruleGroupId") == null) {
+            LOGGER.info("Rule group ID is null when modify.");
+            return ResponseRef.newExternalBuilder().success();
+        }
+        Integer ruleGroupId = null;
+        if (jobContent.get("ruleGroupId") instanceof Double) {
+            Double tempId = (Double) jobContent.get("ruleGroupId");
+            ruleGroupId = tempId.intValue();
+        } else if (jobContent.get("ruleGroupId") instanceof Integer) {
+            ruleGroupId = (Integer) jobContent.get("ruleGroupId");
+        }
+        LOGGER.info("Rules in {} will be modified.", ruleGroupId.toString());
+        try {
+            HttpHeaders headers = new HttpHeaders();
+            headers.setContentType(MediaType.APPLICATION_JSON);
-            RestTemplate restTemplate = new RestTemplate();
+            Map request = new HashMap<>();
+            request.put("rule_group_id", Long.valueOf(ruleGroupId.toString()));
+            request.put("node_name", requestRef.getName());
+
+            HttpEntity entity = new HttpEntity<>(new Gson().toJson(request), headers);
-            HttpEntity entity = new HttpEntity<>(new Gson().toJson(requestPayLoad), headers);
-            String url = HttpUtils.buildUrI(getBaseUrl(), UPDATE_RULE_PATH, appId, appToken,
-                RandomStringUtils.randomNumeric(5), String.valueOf(System.currentTimeMillis())).toString();
-            LOGGER.info("Set context service url is {}", url);
-            Map response = restTemplate.exchange(url, org.springframework.http.HttpMethod.POST, entity, Map.class).getBody();
+            LOGGER.info("Start to modify rule. url: {}, method: {}, body: {}", url, HttpMethod.POST, entity);
+
+            RestTemplate restTemplate = new RestTemplate();
+            Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody();
             if (response == null) {
-                LOGGER.error("Failed to delete rule. Response is null.");
-                throw new ExternalOperationFailedException(90176, "Failed to delete rule. Response is null.");
+                LOGGER.error("Failed to modify rule. Response is null.");
+                throw new ExternalOperationFailedException(90156, "Modify outer url return null.");
             }
+            LOGGER.info("Finished modifying rule. Response: {}", response);
             String status = response.get("code").toString();
-            if (! "200".equals(status)) {
-                String errorMsg = response.get("message").toString();
-                throw new ExternalOperationFailedException(90176, errorMsg);
+            if (status != null && HttpStatus.OK.value() == Integer.parseInt(status)) {
+                LOGGER.info("Modify rule successfully.");
+            } else {
+                LOGGER.error("Modify rule failed.");
+                throw new ExternalOperationFailedException(90156, "Modify outer url return status is not ok.");
             }
-            return new ResponseRefImpl(new Gson().toJson(response), HttpStatus.OK.value(), "", response);
-        } catch (Exception e){
-            throw new ExternalOperationFailedException(90176,"Set context Qualitis AppJointNode Exception", e);
+        } catch (RestClientException e) {
+            LOGGER.error("Failed to modify rule because of restTemplate exception. Exception is: {}", e);
+            throw new ExternalOperationFailedException(90156, "Modify outer url request failed with rest template.");
         }
-    }
-
-    @Override
-    public ResponseRef updateRef(UpdateWitContextRequestRefImpl requestRef) throws ExternalOperationFailedException {
-        return updateQualitisCS(requestRef);
+        return ResponseRef.newExternalBuilder().success();
     }
 }
diff --git a/appconn/src/main/resources/init.sql b/appconn/src/main/resources/init.sql
index 0ad99157..60ea4baf 100644
--- a/appconn/src/main/resources/init.sql
+++ b/appconn/src/main/resources/init.sql
@@ -23,7 +23,6 @@ INSERT INTO `dss_appconn` (
     'DSS_INSTALL_HOME_VAL/dss-appconns/qualitis',
     '');
-
 select @dss_appconn_qualitisId:=id from `dss_appconn` where `appconn_name` = 'qualitis';

 INSERT INTO `dss_appconn_instance`(
@@ -38,6 +37,18 @@ INSERT INTO `dss_appconn_instance`(
     '{"reqUri":""}',
     '#/dashboard');

+INSERT INTO `dss_appconn_instance`(
+    `appconn_id`,
+    `label`,
+    `url`,
+    `enhance_json`,
+    `homepage_uri`)
+    VALUES (@dss_appconn_qualitisId,
+    'PROD',
+    'http://APPCONN_INSTALL_IP:APPCONN_INSTALL_PORT/',
+    '',
+    '#/dashboard');
+
 select @dss_qualitis_appconnId:=id from `dss_appconn` WHERE `appconn_name` in ('qualitis');
 select @qualitis_menuId:=id from `dss_workspace_menu` WHERE `name` in ('数据质量');
@@ -87,11 +98,26 @@ VALUES (@dss_qualitis_appconnId,
 'shujuzhiliang-icon');

 select @dss_qualitisId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis';
+select @dss_qualitisBashId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.bash';
+select @dss_qualitisCheckAlertId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert';
+select @dss_qualitisTableCheckRulesId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.TableRules';

 delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_qualitisId;
 delete from `dss_workflow_node_to_group` where `node_id`=@dss_qualitisId;
 delete from `dss_workflow_node` where `node_type`='linkis.appconn.qualitis';

+delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_qualitisBashId;
+delete from `dss_workflow_node_to_group` where `node_id`=@dss_qualitisBashId;
+delete from `dss_workflow_node` where `node_type`='linkis.appconn.qualitis.bash';
+
+delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_qualitisCheckAlertId;
+delete from `dss_workflow_node_to_group` where `node_id`=@dss_qualitisCheckAlertId;
+delete from `dss_workflow_node` where `node_type`='linkis.appconn.qualitis.checkalert';
+
+delete from `dss_workflow_node_to_ui` where `workflow_node_id`=@dss_qualitisTableCheckRulesId;
+delete from `dss_workflow_node_to_group` where `node_id`=@dss_qualitisTableCheckRulesId;
+delete from `dss_workflow_node` where `node_type`='linkis.appconn.qualitis.TableRules';
+
 INSERT INTO `dss_workflow_node` (
     `icon_path`,
     `node_type`,
@@ -111,9 +137,83 @@ INSERT INTO `dss_workflow_node` (
     0,
     1,
     1,
-    'qualitis');
+    'CheckRules');

+INSERT INTO `dss_workflow_node` (
+    `icon_path`,
+    `node_type`,
+    `appconn_name`,
+    `submit_to_scheduler`,
+    `enable_copy`,
+    `should_creation_before_node`,
+    `support_jump`,
+    `jump_type`,
+    `name`)
+    VALUES (
+    'icons/bash.icon',
+    'linkis.appconn.qualitis.bash',
+    'qualitis',
+    1,
+    0,
+    0,
+    1,
+    1,
+    'ShellRules');
+
+INSERT INTO `dss_workflow_node` (
+    `icon_path`,
+    `node_type`,
+    `appconn_name`,
+    `submit_to_scheduler`,
+    `enable_copy`,
+    `should_creation_before_node`,
+    `support_jump`,
+    `jump_type`,
+    `name`)
+    VALUES (
+    'icons/checkalert.icon',
+    'linkis.appconn.qualitis.checkalert',
+    'qualitis',
+    1,
+    0,
+    0,
+    1,
+    1,
+    'CheckAlert');
+
+INSERT INTO `dss_workflow_node` (
+    `icon_path`,
+    `node_type`,
+    `appconn_name`,
+    `submit_to_scheduler`,
+    `enable_copy`,
+    `should_creation_before_node`,
+    `support_jump`,
+    `jump_type`,
+    `name`)
+    VALUES (
+    'icons/TableRules.icon',
+    'linkis.appconn.qualitis.TableRules',
+    'qualitis',
+    1,
+    0,
+    0,
+    1,
+    1,
+    'TableRules');

 select @dss_qualitis_nodeId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis';
+select @dss_qualitisBash_nodeId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.bash';
+select @dss_qualitisCheckAlert_nodeId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert';
+select @dss_qualitisTableCheckRules_nodeId:=id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.TableRules';

 INSERT INTO `dss_workflow_node_to_group` (`node_id`, `group_id`) VALUES (@dss_qualitis_nodeId, 3);
 INSERT INTO `dss_workflow_node_to_ui` (workflow_node_id, ui_id) VALUES (@dss_qualitis_nodeId, 1),(@dss_qualitis_nodeId,3),(@dss_qualitis_nodeId,5),(@dss_qualitis_nodeId,6),(@dss_qualitis_nodeId,35),(@dss_qualitis_nodeId,36);
+
+INSERT INTO `dss_workflow_node_to_group` (`node_id`, `group_id`) VALUES (@dss_qualitisBash_nodeId, 3);
+INSERT INTO `dss_workflow_node_to_ui` (workflow_node_id, ui_id) VALUES (@dss_qualitisBash_nodeId, 1),(@dss_qualitisBash_nodeId,3),(@dss_qualitisBash_nodeId,5),(@dss_qualitisBash_nodeId,6),(@dss_qualitisBash_nodeId,35),(@dss_qualitisBash_nodeId,36);
+
+INSERT INTO `dss_workflow_node_to_group` (`node_id`, `group_id`) VALUES (@dss_qualitisCheckAlert_nodeId, 3);
+INSERT INTO `dss_workflow_node_to_ui` (workflow_node_id, ui_id) VALUES (@dss_qualitisCheckAlert_nodeId, 1),(@dss_qualitisCheckAlert_nodeId,3),(@dss_qualitisCheckAlert_nodeId,5),(@dss_qualitisCheckAlert_nodeId,6),(@dss_qualitisCheckAlert_nodeId,35),(@dss_qualitisCheckAlert_nodeId,36);
+
+INSERT INTO `dss_workflow_node_to_group` (`node_id`, `group_id`) VALUES (@dss_qualitisTableCheckRules_nodeId, 3);
+INSERT INTO `dss_workflow_node_to_ui` (workflow_node_id, ui_id) VALUES (@dss_qualitisTableCheckRules_nodeId, 1),(@dss_qualitisTableCheckRules_nodeId,3),(@dss_qualitisTableCheckRules_nodeId,5),(@dss_qualitisTableCheckRules_nodeId,6),(@dss_qualitisTableCheckRules_nodeId,35),(@dss_qualitisTableCheckRules_nodeId,36);
+
diff --git a/appconn/src/main/resources/update_to_1.1.10.sql b/appconn/src/main/resources/update_to_1.1.10.sql
new file mode 100644
index 00000000..efc288b1
--- /dev/null
+++ b/appconn/src/main/resources/update_to_1.1.10.sql
@@ -0,0 +1,29 @@
+delete from `dss_workflow_node_to_ui` where `workflow_node_id`=(select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert');
+delete from `dss_workflow_node_to_group` where `node_id`=(select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert');
+delete from `dss_workflow_node` where `node_type`='linkis.appconn.qualitis.checkalert';
+
+INSERT INTO `dss_workflow_node` (
+    `icon_path`,
+    `node_type`,
+    `appconn_name`,
+    `submit_to_scheduler`,
+    `enable_copy`,
+    `should_creation_before_node`,
+    `support_jump`,
+    `jump_type`,
+    `name`)
+    VALUES (
+    'icons/checkalert.icon',
+    'linkis.appconn.qualitis.checkalert',
+    'qualitis',
+    1,
+    0,
+    0,
+    1,
+    1,
+    'checkalert');
+
+
+
+INSERT INTO `dss_workflow_node_to_group` (`node_id`, `group_id`) VALUES ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'), 3);
+INSERT INTO `dss_workflow_node_to_ui` (workflow_node_id, ui_id) VALUES ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'), 1), ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'),3), ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'),5), ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'),6), ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'),35), ((select id from `dss_workflow_node` where `node_type` = 'linkis.appconn.qualitis.checkalert'),36);
diff --git a/build.gradle b/build.gradle
index 0641d5df..5ccf71a3 100644
--- a/build.gradle
+++ b/build.gradle
@@ -35,11 +35,15 @@ buildscript {
         maven {
             url "http://maven.aliyun.com/nexus/content/groups/public/"
         }
+        maven {
+            url "http://maven.javastream.de/"
+        }
+//        mavenCentral()
     }

     dependencies {
         classpath 'net.sf.proguard:proguard-gradle:6.2.0'
-        classpath 'org.springframework.boot:spring-boot-gradle-plugin:1.5.15.RELEASE'
+        classpath 'org.springframework.boot:spring-boot-gradle-plugin:2.7.17'
     }
 }

@@ -52,6 +56,9 @@ repositories {
     maven {
         url "http://maven.aliyun.com/nexus/content/groups/public/"
     }
+    maven {
+        url "http://maven.javastream.de/"
+    }
 }

 mainClassName = "com.webank.wedatasphere.qualitis.QualitisServer"
@@ -95,11 +102,6 @@ distributions {
             dirMode = 0755
             fileMode = 0755
         }
-        into { "doc" } {
-            from "docs"
-            dirMode = 0755
-            fileMode = 0644
-        }
         }
     }
 }
@@ -117,7 +119,9 @@ subprojects {
         maven {
             url "http://maven.aliyun.com/nexus/content/groups/public/"
         }
-//        mavenCentral()
+        maven {
+            url "http://maven.javastream.de/"
+        }
     }

     processResources {
@@ -135,21 +139,9 @@ subprojects {
     configurations.all {
         resolutionStrategy {
-            force 'org.springframework:spring-aop:5.1.18.RELEASE'
-            force 'org.springframework:spring-aspects:5.1.18.RELEASE'
-            force 'org.springframework:spring-beans:5.1.18.RELEASE'
-            force 'org.springframework:spring-context:5.1.18.RELEASE'
-            force 'org.springframework:spring-core:5.1.18.RELEASE'
-            force 'org.springframework:spring-expressions:5.1.18.RELEASE'
-            force 'org.springframework:spring-jcl:5.1.18.RELEASE'
-            force 'org.springframework:spring-jdbc:5.1.18.RELEASE'
-            force 'org.springframework:spring-orm:5.1.18.RELEASE'
-            force 'org.springframework:spring-test:5.1.18.RELEASE'
-            force 'org.springframework:spring-tx:5.1.18.RELEASE'
-            force 'org.springframework:spring-web:5.1.18.RELEASE'
-            force 'org.springframework:spring-webmvc:5.1.18.RELEASE'
-            force 'org.apache.logging.log4j:log4j-slf4j-impl:2.16.0'
+            force 'org.apache.logging.log4j:log4j-slf4j-impl:2.17.1'
         }
+        resolutionStrategy.cacheChangingModulesFor 10,'minutes'
         exclude group: 'log4j', module: 'log4j'
         exclude group: 'org.codehaus.jackson', module: 'jackson-mapper-asl'
     }
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/AlarmClient.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/AlarmClient.java
new file mode 100644
index 00000000..3a725e14
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/AlarmClient.java
@@ -0,0 +1,40 @@
+package com.webank.wedatasphere.qualitis.client;
+
+import com.webank.wedatasphere.qualitis.config.ImsConfig;
+import com.webank.wedatasphere.qualitis.entity.ReportBatchInfo;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author allenzhou
+ */
+public interface AlarmClient {
+    /**
+     * Send alarm.
+     * @param receiver
+     * @param alertTitle
+     * @param alertInfo
+     * @param alertLevel
+     */
+    void sendAlarm(String receiver, String alertTitle, String alertInfo, String alertLevel);
+
+    /**
+     * Send new alarm.
+ * @param requestList + */ + void sendNewAlarm(List> requestList); + + /** + * 上报指标值 或 异常值 + * @param reportBatchInfo + */ + void report(ReportBatchInfo reportBatchInfo); + + /** + * Send Abnormal data record. + * @param imsConfig + * @param data + */ + void sendAbnormalDataRecordAlarm(ImsConfig imsConfig, List> data); +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/impl/ImsAlarmClient.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/impl/ImsAlarmClient.java new file mode 100644 index 00000000..e7300ef7 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/client/impl/ImsAlarmClient.java @@ -0,0 +1,142 @@ +package com.webank.wedatasphere.qualitis.client.impl; + +import com.google.gson.Gson; +import com.webank.wedatasphere.qualitis.client.AlarmClient; +import com.webank.wedatasphere.qualitis.config.ImsConfig; +import com.webank.wedatasphere.qualitis.entity.ReportBatchInfo; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.http.HttpEntity; +import org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.http.MediaType; +import org.springframework.stereotype.Component; +import org.springframework.web.client.RestTemplate; + +import java.util.Arrays; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +/** + * @author allenzhou + */ + +@Component +public class ImsAlarmClient implements AlarmClient { + @Autowired + private ImsConfig imsConfig; + + @Autowired + private RestTemplate restTemplate; + + private static final String IMS_RES_CODE = "resCode"; + private static final Logger LOGGER = LoggerFactory.getLogger(ImsAlarmClient.class); + @Override + public void sendAlarm(String receiver, String alertTitle, String alertInfo, String alertLevel) { + String url = imsConfig.getUrl() + imsConfig.getSendAlarmPath(); + + 
HttpHeaders headers = new HttpHeaders(); + MediaType type = MediaType.parseMediaType("application/json; charset=UTF-8"); + headers.setContentType(type); + headers.add("Accept", MediaType.TEXT_HTML.toString()); + + Map>> outerMap = new HashMap<>(2); + Map innerMap = new HashMap<>(8); + innerMap.put("alert_title", alertTitle); + innerMap.put("sub_system_id", imsConfig.getSystemId()); + innerMap.put("alert_level", alertLevel); + innerMap.put("alert_info", alertInfo); + innerMap.put("alert_way", imsConfig.getAlertWay()); + innerMap.put("alert_reciver", receiver); + outerMap.put("alertList", Arrays.asList(innerMap)); + Gson gson = new Gson(); + HttpEntity entity = new HttpEntity<>(gson.toJson(outerMap), headers); + + LOGGER.info("Start to send ims. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity); + String response; + try { + response = restTemplate.postForObject(url, entity, String.class); + } catch (Exception e) { + LOGGER.error("Failed to send ims alarm. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity, e); + return; + } + LOGGER.info("Succeed to send ims. response: {}", response); + } + + @Override + public void sendNewAlarm(List> requestList) { + String url = imsConfig.getUrl() + imsConfig.getSendAlarmPath(); + + HttpHeaders headers = new HttpHeaders(); + MediaType type = MediaType.parseMediaType("application/json; charset=UTF-8"); + headers.setContentType(type); + headers.add("Accept", MediaType.TEXT_HTML.toString()); + Map requestMap = new HashMap<>(2); + requestMap.put("userAuthKey", imsConfig.getUserAuthKey()); + requestMap.put("alertList", requestList); + Gson gson = new Gson(); + HttpEntity entity = new HttpEntity<>(gson.toJson(requestMap), headers); + + LOGGER.info("Start to send ims. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity); + String response; + try { + response = restTemplate.postForObject(url, entity, String.class); + } catch (Exception e) { + LOGGER.error("Failed to send ims alarm. 
url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity, e); + return; + } + LOGGER.info("Succeed to send ims. response: {}", response); + } + + @Override + public void report(ReportBatchInfo reportBatchInfo) { + String url = imsConfig.getUrl() + imsConfig.getSendReportPath(); + + HttpHeaders headers = new HttpHeaders(); + MediaType type = MediaType.parseMediaType("application/json; charset=UTF-8"); + headers.setContentType(type); + + Gson gson = new Gson(); + HttpEntity entity = new HttpEntity<>(gson.toJson(reportBatchInfo), headers); + + LOGGER.info("Start to send ims rule metric report. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity); + String response; + try { + response = restTemplate.postForObject(url, entity, String.class); + } catch (Exception e) { + LOGGER.error("Failed to send ims alarm rule metric report. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity, e); + return; + } + LOGGER.info("Succeed to send ims rule metric report. response: {}", response); + } + + @Override + public void sendAbnormalDataRecordAlarm(ImsConfig imsConfig, List> data) { + String url = imsConfig.getFullUrlAbnormalDataRecord(); + + HttpHeaders headers = new HttpHeaders(); + MediaType type = MediaType.parseMediaType("application/json; charset=UTF-8"); + headers.setContentType(type); + Map map = new HashMap<>(2); + map.put("userAuthKey", imsConfig.getUserAuthKey()); + map.put("data", data); + Gson gson = new Gson(); + HttpEntity entity = new HttpEntity<>(gson.toJson(map), headers); + + LOGGER.info("Start to send abnormal data record ims. url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity); + Map response; + try { + response = restTemplate.postForObject(url, entity, Map.class); + } catch (Exception e) { + LOGGER.error("Failed to send abnormal data record ims. 
url: {}, method: {}, body: {}", url, HttpMethod.POST.name(), entity, e); + return; + } + if (0 == Integer.parseInt(response.get(IMS_RES_CODE).toString())) { + LOGGER.info("Success to send abnormal data record ims. response: {}", response); + } else { + LOGGER.error("Failed to send abnormal data record ims. response: {}", response); + } + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/config/ImsConfig.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/config/ImsConfig.java new file mode 100644 index 00000000..8750a529 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/config/ImsConfig.java @@ -0,0 +1,165 @@ +package com.webank.wedatasphere.qualitis.config; + +import org.springframework.beans.factory.annotation.Value; +import org.springframework.context.annotation.Configuration; + +/** + * @author allenzhou + */ +@Configuration +public class ImsConfig { + @Value("${alarm.ims.url}") + private String url; + @Value("${alarm.ims.send_alarm_path}") + private String sendAlarmPath; + @Value("${alarm.ims.send_report_path}") + private String sendReportPath; + @Value("${alarm.ims.full_url_abnormal_data_record}") + private String fullUrlAbnormalDataRecord; + @Value("${alarm.ims.system_id}") + private String systemId; + @Value("${alarm.ims.alert_way}") + private String alertWay; + @Value("${alarm.ims.title_prefix}") + private String titlePrefix; + @Value("${alarm.ims.new_title_prefix}") + private String newTitlePrefix; + @Value("${alarm.ims.userAuthKey}") + private String userAuthKey; + /** + * 运维人员 + */ + @Value("${alarm.ims.receiver.fail}") + private String failReceiver; + @Value("${alarm.ims.receiver.out_threshold}") + private String outThresholdReceiver; + /** + * 任务运行时长超长告警运维人员 + */ + @Value("${alarm.ims.receiver.task_time_out}") + private String taskTimeOutReceiver; + /** + * 任务运行时长超长告警标题 + */ + @Value("${alarm.ims.task_time_out.alarm_title}") + private String taskTimeOutAlarmTitle; + + 
@Value("${alarm.ims.new_title_succeed_prefix}") + private String newTitleSucceedPrefix; + + public ImsConfig() { + // 默认构造函数 + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getSendAlarmPath() { + return sendAlarmPath; + } + + public void setSendAlarmPath(String sendAlarmPath) { + this.sendAlarmPath = sendAlarmPath; + } + + public String getSendReportPath() { + return sendReportPath; + } + + public void setSendReportPath(String sendReportPath) { + this.sendReportPath = sendReportPath; + } + + public String getFullUrlAbnormalDataRecord() { + return fullUrlAbnormalDataRecord; + } + + public void setFullUrlAbnormalDataRecord(String fullUrlAbnormalDataRecord) { + this.fullUrlAbnormalDataRecord = fullUrlAbnormalDataRecord; + } + + public String getSystemId() { + return systemId; + } + + public void setSystemId(String systemId) { + this.systemId = systemId; + } + + public String getAlertWay() { + return alertWay; + } + + public void setAlertWay(String alertWay) { + this.alertWay = alertWay; + } + + public String getTitlePrefix() { + return titlePrefix; + } + + public void setTitlePrefix(String titlePrefix) { + this.titlePrefix = titlePrefix; + } + + public String getNewTitlePrefix() { + return newTitlePrefix; + } + + public void setNewTitlePrefix(String newTitlePrefix) { + this.newTitlePrefix = newTitlePrefix; + } + + public String getFailReceiver() { + return failReceiver; + } + + public void setFailReceiver(String failReceiver) { + this.failReceiver = failReceiver; + } + + public String getOutThresholdReceiver() { + return outThresholdReceiver; + } + + public void setOutThresholdReceiver(String outThresholdReceiver) { + this.outThresholdReceiver = outThresholdReceiver; + } + + public String getTaskTimeOutReceiver() { + return taskTimeOutReceiver; + } + + public void setTaskTimeOutReceiver(String taskTimeOutReceiver) { + this.taskTimeOutReceiver = taskTimeOutReceiver; + } + + public String 
getTaskTimeOutAlarmTitle() { + return taskTimeOutAlarmTitle; + } + + public void setTaskTimeOutAlarmTitle(String taskTimeOutAlarmTitle) { + this.taskTimeOutAlarmTitle = taskTimeOutAlarmTitle; + } + + public String getUserAuthKey() { + return userAuthKey; + } + + public void setUserAuthKey(String userAuthKey) { + this.userAuthKey = userAuthKey; + } + + public String getNewTitleSucceedPrefix() { + return newTitleSucceedPrefix; + } + + public void setNewTitleSucceedPrefix(String newTitleSucceedPrefix) { + this.newTitleSucceedPrefix = newTitleSucceedPrefix; + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/AlertTypeEnum.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/AlertTypeEnum.java new file mode 100644 index 00000000..f3c2baeb --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/AlertTypeEnum.java @@ -0,0 +1,39 @@ +package com.webank.wedatasphere.qualitis.constant; + +/** + * @author allenzhou + */ +public enum AlertTypeEnum { + /** + * 告警消息类型 + */ + TASK_TIME_OUT(1, "Task time out", "任务运行时间过长"), + TASK_FAILED(2, "Task failed", "任务执行失败"), + TASK_FAIL_CHECKOUT(3, "Task failed to checkout", "任务执行成功,未通过阈值校验"), + TASK_INIT_FAIL(4, "Task failed to initial", "任务初始化失败"), + TASK_SUCCESS(5, "Task success", "任务执行成功") + ; + + private int code; + private String status; + private String message; + + AlertTypeEnum(int code, String status, String message) { + this.code = code; + this.status = status; + this.message = message; + } + + public int getCode() { + return code; + } + + public String getStatus() { + return status; + } + + public String getMessage() { + return message; + } + +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/ImsLevelEnum.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/ImsLevelEnum.java new file mode 100644 index 00000000..e89d2f99 --- /dev/null +++ 
b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/constant/ImsLevelEnum.java
@@ -0,0 +1,26 @@
+package com.webank.wedatasphere.qualitis.constant;
+
+/**
+ * @author allenzhou
+ */
+public enum ImsLevelEnum {
+    /**
+     * IMS alert levels:
+     * 1 CRITICAL, 2 MAJOR, 3 MINOR, 4 WARNING, 5 INFO
+     */
+    CRITICAL("1"),
+    MAJOR("2"),
+    MINOR("3"),
+    WARNING("4"),
+    INFO("5");
+
+    private String code;
+
+    ImsLevelEnum(String code) {
+        this.code = code;
+    }
+
+    public String getCode() {
+        return code;
+    }
+}
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AbnormalDataRecordInfoDao.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AbnormalDataRecordInfoDao.java
new file mode 100644
index 00000000..1691b01f
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AbnormalDataRecordInfoDao.java
@@ -0,0 +1,47 @@
+package com.webank.wedatasphere.qualitis.dao;
+
+import com.webank.wedatasphere.qualitis.entity.AbnormalDataRecordInfo;
+import java.util.Date;
+import java.util.List;
+
+/**
+ * @author allenzhou
+ */
+public interface AbnormalDataRecordInfoDao {
+    /**
+     * Save.
+     * @param abnormalDataRecordInfo
+     * @return
+     */
+    AbnormalDataRecordInfo save(AbnormalDataRecordInfo abnormalDataRecordInfo);
+
+    /**
+     * Batch save.
+     * @param abnormalDataRecordInfos entities to save
+     * @return the saved entities
+     */
+    List<AbnormalDataRecordInfo> saveAll(List<AbnormalDataRecordInfo> abnormalDataRecordInfos);
+
+    /**
+     * Find with primary key.
+     * @param ruleId
+     * @param dbName
+     * @param tableName
+     * @param date
+     * @return
+     */
+    AbnormalDataRecordInfo findByPrimary(Long ruleId, String dbName, String tableName, String date);
+
+    /**
+     * Find all.
+     * @return
+     */
+    List<AbnormalDataRecordInfo> findAll();
+
+    /**
+     * Find with rule by record date.
+     * @param recordDate
+     * @return
+     */
+    List<AbnormalDataRecordInfo> findWithExistRulesByRecordDate(String recordDate);
+}
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AlarmInfoDao.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AlarmInfoDao.java
new file mode 100644
index 00000000..65f4cdfe
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/AlarmInfoDao.java
@@ -0,0 +1,81 @@
+package com.webank.wedatasphere.qualitis.dao;
+
+import com.webank.wedatasphere.qualitis.entity.AlarmInfo;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author allenzhou
+ */
+public interface AlarmInfoDao {
+    /**
+     * Save an alarm info.
+     * @param alarmInfo alarm info to save
+     * @return the saved alarm info
+     */
+    AlarmInfo save(AlarmInfo alarmInfo);
+
+    /**
+     * Alarms received by a user within a time range, paged.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @param page page number
+     * @param size page size
+     * @return paged alarms received by the user within the time range
+     */
+    List<AlarmInfo> findAllByUsernameAndAlarmTimeBetweenPage(String username, String startAlarmTime, String endAlarmTime, int page, int size);
+
+    /**
+     * Alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return alarms received by the user within the time range
+     */
+    List<AlarmInfo> findAllByUsernameAndAlarmTimeBetween(String username, String startAlarmTime,
+        String endAlarmTime);
+
+    /**
+     * Alarms received by a user within a time range, grouped per day.
+     * @param username
+     * @param startAlarmTime
+     * @param endAlarmTime
+     * @return
+     */
+    List<Map<String, Object>> findAllByUsernameAndAlarmTimeBetweenPerDay(String username, String startAlarmTime, String endAlarmTime);
+
+    /**
+     * Total count of alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return total count of alarms received by the user within the time range
+     */
+    long countByUsernameAndAlarmTimeBetween(String username, String startAlarmTime, String endAlarmTime);
+
+    /**
+     * Count of alarms received by a user within a time range, filtered by alarm level.
+     * @param username
+     * @param startAlarmTime
+     * @param endAlarmTime
+     * @param level
+     * @return
+     */
+    long countByUsernameAndAlarmTimeBetweenAndAlarmLevel(String username, String startAlarmTime, String endAlarmTime, String level);
+
+    /**
+     * Check whether an alarm exists.
+     * @param taskId task ID
+     * @param alarmType alarm type
+     * @return whether such an alarm exists
+     */
+    boolean existsByTaskIdAndAlarmType(Integer taskId, int alarmType);
+
+    /**
+     * Batch save alarm infos.
+     * @param alarmInfos alarm infos to save
+     * @return the saved alarm infos
+     */
+    List<AlarmInfo> saveAll(List<AlarmInfo> alarmInfos);
+}
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/UploadRecordDao.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/UploadRecordDao.java
new file mode 100644
index 00000000..3eca57a4
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/UploadRecordDao.java
@@ -0,0 +1,32 @@
+package com.webank.wedatasphere.qualitis.dao;
+
+import com.webank.wedatasphere.qualitis.entity.UploadRecord;
+import java.util.Date;
+import java.util.List;
+
+/**
+ * @author allenzhou
+ */
+public interface UploadRecordDao {
+    /**
+     * Save.
+     * @param uploadRecord
+     * @return
+     */
+    UploadRecord save(UploadRecord uploadRecord);
+
+    /**
+     * Find with unique keys (status, upload date).
+     * @param recordDate
+     * @param status
+     * @return
+     */
+    UploadRecord findByUnique(Date recordDate, Boolean status);
+
+    /**
+     * Find all.
+ * @return + */ + List findAll(); + +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AbnormalDataRecordInfoDaoImpl.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AbnormalDataRecordInfoDaoImpl.java new file mode 100644 index 00000000..083ae973 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AbnormalDataRecordInfoDaoImpl.java @@ -0,0 +1,44 @@ +package com.webank.wedatasphere.qualitis.dao.impl; + +import com.webank.wedatasphere.qualitis.dao.AbnormalDataRecordInfoDao; +import com.webank.wedatasphere.qualitis.dao.repository.AbnormalDataRecordInfoRepository; +import com.webank.wedatasphere.qualitis.entity.AbnormalDataRecordInfo; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Repository; +import java.util.Date; +import java.util.List; + +/** + * @author allenzhou@webank.com + * @date 2021/10/5 12:20 + */ +@Repository +public class AbnormalDataRecordInfoDaoImpl implements AbnormalDataRecordInfoDao { + @Autowired + private AbnormalDataRecordInfoRepository abnormalDataRecordInfoRepository; + + @Override + public AbnormalDataRecordInfo save(AbnormalDataRecordInfo abnormalDataRecordInfo) { + return abnormalDataRecordInfoRepository.save(abnormalDataRecordInfo); + } + + @Override + public List saveAll(List abnormalDataRecordInfos) { + return abnormalDataRecordInfoRepository.saveAll(abnormalDataRecordInfos); + } + + @Override + public AbnormalDataRecordInfo findByPrimary(Long ruleId, String dbName, String tableName, String recordDate) { + return abnormalDataRecordInfoRepository.findByPrimary(ruleId, dbName, tableName, recordDate); + } + + @Override + public List findAll() { + return abnormalDataRecordInfoRepository.findAll(); + } + + @Override + public List findWithExistRulesByRecordDate(String recordDate) { + return abnormalDataRecordInfoRepository.findWithExistRulesByRecordDate(recordDate); + } +} diff --git 
a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AlarmInfoDaoImpl.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AlarmInfoDaoImpl.java
new file mode 100644
index 00000000..e37153ad
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/AlarmInfoDaoImpl.java
@@ -0,0 +1,108 @@
+package com.webank.wedatasphere.qualitis.dao.impl;
+
+import com.webank.wedatasphere.qualitis.dao.AlarmInfoDao;
+import com.webank.wedatasphere.qualitis.dao.repository.AlarmInfoRepository;
+import com.webank.wedatasphere.qualitis.entity.AlarmInfo;
+import java.util.List;
+import java.util.Map;
+import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.data.domain.PageRequest;
+import org.springframework.data.domain.Pageable;
+import org.springframework.data.domain.Sort;
+import org.springframework.stereotype.Repository;
+
+/**
+ * @author allenzhou
+ */
+@Repository
+public class AlarmInfoDaoImpl implements AlarmInfoDao {
+    @Autowired
+    private AlarmInfoRepository alarmInfoRepository;
+
+    @Override
+    public AlarmInfo save(AlarmInfo alarmInfo) {
+        return alarmInfoRepository.save(alarmInfo);
+    }
+
+    /**
+     * Alarms received by a user within a time range, paged.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @param page page number
+     * @param size page size
+     * @return List of alarm infos
+     */
+    @Override
+    public List<AlarmInfo> findAllByUsernameAndAlarmTimeBetweenPage(String username, String startAlarmTime, String endAlarmTime, int page, int size) {
+        Sort sort = Sort.by(Sort.Direction.DESC, "alarmTime");
+        Pageable pageable = PageRequest.of(page, size, sort);
+        return alarmInfoRepository.findAllByUsernameAndAlarmTimeBetween(username, startAlarmTime, endAlarmTime, pageable);
+    }
+
+    /**
+     * Alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return List of alarm infos
+     */
+    @Override
+    public List<AlarmInfo> findAllByUsernameAndAlarmTimeBetween(String username, String startAlarmTime, String endAlarmTime) {
+        return alarmInfoRepository.findAllByUsernameAndAlarmTimeBetween(username, startAlarmTime, endAlarmTime);
+    }
+
+    /**
+     * Alarms received by a user within a time range, grouped per day and level.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return per-day alarm counts grouped by level
+     */
+    @Override
+    public List<Map<String, Object>> findAllByUsernameAndAlarmTimeBetweenPerDay(String username, String startAlarmTime, String endAlarmTime) {
+        return alarmInfoRepository.findAllByUsernameAndAlarmTimeBetweenPerDay(username, startAlarmTime, endAlarmTime);
+    }
+
+    /**
+     * Count of alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return count of alarm infos
+     */
+    @Override
+    public long countByUsernameAndAlarmTimeBetween(String username, String startAlarmTime,
+        String endAlarmTime) {
+        return alarmInfoRepository.countByUsernameAndAlarmTimeBetween(username, startAlarmTime, endAlarmTime);
+    }
+
+    /**
+     * Count of alarms received by a user within a time range, filtered by alarm level.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @param level
+     * @return count of alarm infos
+     */
+    @Override
+    public long countByUsernameAndAlarmTimeBetweenAndAlarmLevel(String username, String startAlarmTime, String endAlarmTime, String level) {
+        return alarmInfoRepository.countByUsernameAndAlarmTimeBetweenAndAlarmLevel(username, startAlarmTime, endAlarmTime, level);
+    }
+
+    /**
+     * Check whether an alarm exists.
+     * @param taskId task ID
+     * @param alarmType alarm type
+     * @return whether such an alarm exists
+     */
+    @Override
+    public boolean existsByTaskIdAndAlarmType(Integer taskId, int alarmType) {
+        return alarmInfoRepository.existsByTaskIdAndAlarmType(taskId, alarmType);
+    }
+
+    @Override
+    public List<AlarmInfo> saveAll(List<AlarmInfo> alarmInfos) {
+        return alarmInfoRepository.saveAll(alarmInfos);
+    }
+}
diff --git
a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/UploadRecordDaoImpl.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/UploadRecordDaoImpl.java new file mode 100644 index 00000000..204bf19d --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/UploadRecordDaoImpl.java @@ -0,0 +1,34 @@ +package com.webank.wedatasphere.qualitis.dao.impl; + +import com.webank.wedatasphere.qualitis.dao.UploadRecordDao; +import com.webank.wedatasphere.qualitis.dao.repository.UploadRecordRepository; +import org.springframework.beans.factory.annotation.Autowired; +import com.webank.wedatasphere.qualitis.entity.UploadRecord; +import org.springframework.stereotype.Repository; +import java.util.Date; +import java.util.List; + +/** + * @author allenzhou@webank.com + * @date 2021/10/25 14:10 + */ +@Repository +public class UploadRecordDaoImpl implements UploadRecordDao { + @Autowired + private UploadRecordRepository uploadRecordRepository; + + @Override + public UploadRecord save(UploadRecord uploadRecord) { + return uploadRecordRepository.save(uploadRecord); + } + + @Override + public UploadRecord findByUnique(Date recordDate, Boolean status) { + return uploadRecordRepository.findByUnique(recordDate, status); + } + + @Override + public List findAll() { + return uploadRecordRepository.findAll(); + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AbnormalDataRecordInfoRepository.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AbnormalDataRecordInfoRepository.java new file mode 100644 index 00000000..24ba0eb6 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AbnormalDataRecordInfoRepository.java @@ -0,0 +1,32 @@ +package com.webank.wedatasphere.qualitis.dao.repository; + +import com.webank.wedatasphere.qualitis.entity.AbnormalDataRecordInfo; +import 
com.webank.wedatasphere.qualitis.entity.AbnormalDataRecordPrimaryKey;
+import java.util.Date;
+import org.springframework.data.jpa.repository.JpaRepository;
+import org.springframework.data.jpa.repository.Query;
+import java.util.List;
+
+/**
+ * @author allenzhou
+ */
+public interface AbnormalDataRecordInfoRepository extends JpaRepository<AbnormalDataRecordInfo, AbnormalDataRecordPrimaryKey> {
+    /**
+     * Find one by primary keys.
+     * @param ruleId
+     * @param dbName
+     * @param tableName
+     * @param recordDate
+     * @return
+     */
+    @Query(value = "SELECT a FROM AbnormalDataRecordInfo a WHERE a.ruleId = ?1 AND a.dbName = ?2 AND a.tableName = ?3 AND a.recordDate = ?4")
+    AbnormalDataRecordInfo findByPrimary(Long ruleId, String dbName, String tableName, String recordDate);
+
+    /**
+     * Find with rule by record date.
+     * @param recordDate
+     * @return
+     */
+    @Query(value = "select a from AbnormalDataRecordInfo a where exists (select id from Rule r where r.id = a.ruleId) and a.recordDate = ?1")
+    List<AbnormalDataRecordInfo> findWithExistRulesByRecordDate(String recordDate);
+}
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AlarmInfoRepository.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AlarmInfoRepository.java
new file mode 100644
index 00000000..df99ca41
--- /dev/null
+++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/AlarmInfoRepository.java
@@ -0,0 +1,69 @@
+package com.webank.wedatasphere.qualitis.dao.repository;
+
+import java.util.List;
+import java.util.Map;
+import org.springframework.data.domain.Pageable;
+import com.webank.wedatasphere.qualitis.entity.AlarmInfo;
+import org.springframework.data.jpa.repository.JpaRepository;
+import org.springframework.data.jpa.repository.Query;
+
+/**
+ * @author allenzhou
+ */
+public interface AlarmInfoRepository extends JpaRepository<AlarmInfo, Long> {
+    /**
+     * Alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return List of alarm infos
+     */
+    List<AlarmInfo> findAllByUsernameAndAlarmTimeBetween(String username, String startAlarmTime, String endAlarmTime);
+
+    /**
+     * Alarms received by a user within a time range, grouped per day and level.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return per-day alarm counts grouped by level
+     */
+    @Query(value = "select new map(DATE_FORMAT(a.beginTime, '%Y-%m-%d') as alarm_day, a.alarmLevel as alarm_level, count(a.id) as alarm_count) from AlarmInfo a where a.username = ?1 and beginTime BETWEEN ?2 AND ?3 group by DATE_FORMAT(a.beginTime, '%Y-%m-%d'), a.alarmLevel")
+    List<Map<String, Object>> findAllByUsernameAndAlarmTimeBetweenPerDay(String username, String startAlarmTime, String endAlarmTime);
+
+    /**
+     * Alarms received by a user within a time range, paged.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @param pageable paging parameters
+     * @return List of alarm infos
+     */
+    List<AlarmInfo> findAllByUsernameAndAlarmTimeBetween(String username, String startAlarmTime, String endAlarmTime, Pageable pageable);
+
+    /**
+     * Count of alarms received by a user within a time range, filtered by alarm level.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @param alarmLevel
+     * @return count of alarm infos
+     */
+    long countByUsernameAndAlarmTimeBetweenAndAlarmLevel(String username, String startAlarmTime, String endAlarmTime, String alarmLevel);
+
+    /**
+     * Count of alarms received by a user within a time range.
+     * @param username username
+     * @param startAlarmTime start time, yyyy-MM-dd HH:mm:ss
+     * @param endAlarmTime end time, yyyy-MM-dd HH:mm:ss
+     * @return count of alarm infos
+     */
+    long countByUsernameAndAlarmTimeBetween(String username, String startAlarmTime, String endAlarmTime);
+
+    /**
+     * Check whether an alarm exists.
+     * @param taskId task ID
+     * @param alarmType alarm type
+     * @return whether such an alarm exists
+     */
+    boolean existsByTaskIdAndAlarmType(Integer taskId, int alarmType);
+}
diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/UploadRecordRepository.java
b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/UploadRecordRepository.java new file mode 100644 index 00000000..c22b0da6 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/UploadRecordRepository.java @@ -0,0 +1,20 @@ +package com.webank.wedatasphere.qualitis.dao.repository; + +import com.webank.wedatasphere.qualitis.entity.UploadRecord; +import org.springframework.data.jpa.repository.JpaRepository; +import org.springframework.data.jpa.repository.Query; +import java.util.Date; + +/** + * @author allenzhou + */ +public interface UploadRecordRepository extends JpaRepository { + /** + * Find one by unique keys(status, upload date). + * @param recordDate + * @param status + * @return + */ + @Query(value = "SELECT ur FROM UploadRecord ur WHERE ur.uploadDate = ?1 and ur.status = ?2") + UploadRecord findByUnique(Date recordDate, Boolean status); +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordInfo.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordInfo.java new file mode 100644 index 00000000..3911ee65 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordInfo.java @@ -0,0 +1,206 @@ +package com.webank.wedatasphere.qualitis.entity; + +import java.util.Objects; +import javax.persistence.Id; +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.IdClass; +import javax.persistence.Table; + +/** + * @author allenzhou@webank.com + * @date 2021/10/5 11:10 + */ +@Entity +@Table(name = "qualitis_abnormal_data_record_info") +@IdClass(AbnormalDataRecordPrimaryKey.class) +public class AbnormalDataRecordInfo { + @Id + @Column(name = "rule_id") + private Long ruleId; + @Column(name = "rule_name") + private String ruleName; + @Column(name = "rule_detail") + private String ruleDetail; + @Column(name = "datasource") + private String datasource; 
+ @Id + @Column(name = "db_name") + private String dbName; + @Id + @Column(name = "table_name") + private String tableName; + @Column(name = "dept") + private String departmentName; + @Column(name = "sub_system_id") + private Integer subSystemId; + @Column(name = "execute_num") + private Integer executeNum; + @Column(name = "event_num") + private Integer eventNum; + @Id + @Column(name = "record_date") + private String recordDate; + @Column(name = "record_time") + private String recordTime; + + public AbnormalDataRecordInfo() { + // Do nothing. + } + + public AbnormalDataRecordInfo(Long ruleId, String ruleName, String datasourceType, String dbName, String tableName, String departmentName, Integer subSystemId, int execNum, int alarmNum) { + this.ruleId = ruleId; + this.ruleName = ruleName; + this.datasource = datasourceType; + this.dbName = dbName; + this.tableName = tableName; + this.departmentName = departmentName; + this.subSystemId = subSystemId; + this.executeNum = execNum; + this.eventNum = alarmNum; + } + + public Long getRuleId() { + return ruleId; + } + + public void setRuleId(Long ruleId) { + this.ruleId = ruleId; + } + + public String getRuleName() { + return ruleName; + } + + public void setRuleName(String ruleName) { + this.ruleName = ruleName; + } + + public String getRuleDetail() { + return ruleDetail; + } + + public void setRuleDetail(String ruleDetail) { + this.ruleDetail = ruleDetail; + } + + public String getDatasource() { + return datasource; + } + + public void setDatasource(String datasource) { + this.datasource = datasource; + } + + public String getDbName() { + return dbName; + } + + public void setDbName(String dbName) { + this.dbName = dbName; + } + + public String getTableName() { + return tableName; + } + + public void setTableName(String tableName) { + this.tableName = tableName; + } + + public String getDepartmentName() { + return departmentName; + } + + public void setDepartmentName(String departmentName) { + this.departmentName = 
departmentName; + } + + public Integer getSubSystemId() { + return subSystemId; + } + + public void setSubSystemId(Integer subSystemId) { + this.subSystemId = subSystemId; + } + + public Integer getExecuteNum() { + return executeNum; + } + + public void setExecuteNum(Integer executeNum) { + this.executeNum = executeNum; + } + + public Integer getEventNum() { + return eventNum; + } + + public void setEventNum(Integer eventNum) { + this.eventNum = eventNum; + } + + public String getRecordDate() { + return recordDate; + } + + public void setRecordDate(String recordDate) { + this.recordDate = recordDate; + } + + public String getRecordTime() { + return recordTime; + } + + public void setRecordTime(String recordTime) { + this.recordTime = recordTime; + } + + @Override + public boolean equals(Object o) { + if (this == o) { + return true; + } + if (o == null || getClass() != o.getClass()) { + return false; + } + AbnormalDataRecordInfo that = (AbnormalDataRecordInfo) o; + return Objects.equals(ruleId, that.ruleId) && + Objects.equals(ruleName, that.ruleName) && + Objects.equals(ruleDetail, that.ruleDetail) && + Objects.equals(datasource, that.datasource) && + Objects.equals(dbName, that.dbName) && + Objects.equals(tableName, that.tableName) && + Objects.equals(departmentName, that.departmentName) && + Objects.equals(subSystemId, that.subSystemId) && + Objects.equals(executeNum, that.executeNum) && + Objects.equals(eventNum, that.eventNum) && + Objects.equals(recordDate, that.recordDate) && + Objects.equals(recordTime, that.recordTime); + } + + @Override + public int hashCode() { + return Objects + .hash(ruleId, ruleName, ruleDetail, datasource, dbName, tableName, departmentName, subSystemId, executeNum, eventNum, recordDate, + recordTime); + } + + @Override + public String toString() { + return "AbnormalDataRecordInfo{" + + "ruleId=" + ruleId + + ", ruleName='" + ruleName + '\'' + + ", ruleDetail='" + ruleDetail + '\'' + + ", datasource='" + datasource + '\'' + + ", 
dbName='" + dbName + '\'' + + ", tableName='" + tableName + '\'' + + ", departmentName='" + departmentName + '\'' + + ", subSystemId=" + subSystemId + + ", executeNum=" + executeNum + + ", eventNum=" + eventNum + + ", recordDate=" + recordDate + + ", recordTime=" + recordTime + + '}'; + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordPrimaryKey.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordPrimaryKey.java new file mode 100644 index 00000000..2d390bfe --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AbnormalDataRecordPrimaryKey.java @@ -0,0 +1,72 @@ +package com.webank.wedatasphere.qualitis.entity; + +import java.io.Serializable; +import java.util.Objects; +import java.util.Date; + +/** + * @author allenzhou@webank.com + * @date 2021/10/7 12:49 + */ +public class AbnormalDataRecordPrimaryKey implements Serializable { + private Long ruleId; + private String dbName; + private String tableName; + private String recordDate; + + public AbnormalDataRecordPrimaryKey() { + // Do nothing. 
+ } + + public Long getRuleId() { + return ruleId; + } + + public void setRuleId(Long ruleId) { + this.ruleId = ruleId; + } + + public String getDbName() { + return dbName; + } + + public void setDbName(String dbName) { + this.dbName = dbName; + } + + public String getTableName() { + return tableName; + } + + public void setTableName(String tableName) { + this.tableName = tableName; + } + + public String getRecordDate() { + return recordDate; + } + + public void setRecordDate(String recordDate) { + this.recordDate = recordDate; + } + + @Override + public boolean equals(Object o) { + if (this == o) { + return true; + } + if (o == null || getClass() != o.getClass()) { + return false; + } + AbnormalDataRecordPrimaryKey that = (AbnormalDataRecordPrimaryKey) o; + return Objects.equals(ruleId, that.ruleId) && + Objects.equals(dbName, that.dbName) && + Objects.equals(tableName, that.tableName) && + Objects.equals(recordDate, that.recordDate); + } + + @Override + public int hashCode() { + return Objects.hash(ruleId, dbName, tableName, recordDate); + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AlarmInfo.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AlarmInfo.java new file mode 100644 index 00000000..47a46aa0 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/AlarmInfo.java @@ -0,0 +1,148 @@ +package com.webank.wedatasphere.qualitis.entity; + +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.GeneratedValue; +import javax.persistence.GenerationType; +import javax.persistence.Id; +import javax.persistence.Table; +import org.springframework.beans.BeanUtils; + +/** + * @author allenzhou + */ +@Entity +@Table(name = "qualitis_alarm_info") +public class AlarmInfo { + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + private Long id; + @Column(name = "alarm_level", length = 1) + private String alarmLevel; + @Column(name = 
"alarm_reason", columnDefinition = "TEXT") + private String alarmReason; + @Column(name = "application_id", length = 40) + private String applicationId; + @Column(name = "begin_time", length = 20) + private String beginTime; + @Column(name = "end_time", length = 20) + private String endTime; + @Column(name = "alarm_time", length = 20) + private String alarmTime; + @Column(length = 50) + private String username; + @Column(name = "alarm_type") + private Integer alarmType; + @Column(name = "task_id") + private Integer taskId; + @Column(name = "project_name") + private String projectName; + + public AlarmInfo() { + } + + public AlarmInfo(AlarmInfo alarmInfo) { + BeanUtils.copyProperties(alarmInfo, this); + } + + public AlarmInfo(String alarmLevel, String alarmReason, String applicationId, String beginTime, String endTime, String alarmTime, + String username, int alarmType, String projectName) { + this.alarmLevel = alarmLevel; + this.alarmReason = alarmReason; + this.applicationId = applicationId; + this.beginTime = beginTime; + this.endTime = endTime; + this.alarmTime = alarmTime; + this.username = username; + this.alarmType = alarmType; + this.projectName = projectName; + } + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getAlarmLevel() { + return alarmLevel; + } + + public void setAlarmLevel(String alarmLevel) { + this.alarmLevel = alarmLevel; + } + + public String getAlarmReason() { + return alarmReason; + } + + public void setAlarmReason(String alarmReason) { + this.alarmReason = alarmReason; + } + + public String getApplicationId() { + return applicationId; + } + + public void setApplicationId(String applicationId) { + this.applicationId = applicationId; + } + + public String getBeginTime() { + return beginTime; + } + + public void setBeginTime(String beginTime) { + this.beginTime = beginTime; + } + + public String getEndTime() { + return endTime; + } + + public void setEndTime(String endTime) { + 
this.endTime = endTime; + } + + public String getAlarmTime() { + return alarmTime; + } + + public void setAlarmTime(String alarmTime) { + this.alarmTime = alarmTime; + } + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } + + public Integer getAlarmType() { + return alarmType; + } + + public void setAlarmType(Integer alarmType) { + this.alarmType = alarmType; + } + + public Integer getTaskId() { + return taskId; + } + + public void setTaskId(Integer taskId) { + this.taskId = taskId; + } + + public String getProjectName() { + return projectName; + } + + public void setProjectName(String projectName) { + this.projectName = projectName; + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/MetricData.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/MetricData.java new file mode 100644 index 00000000..a86d7824 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/MetricData.java @@ -0,0 +1,86 @@ +package com.webank.wedatasphere.qualitis.entity; + +/** + * @author allenzhou@webank.com + * @date 2021/4/27 15:05 + */ +public class MetricData { + private String subsystemId; + private String interfaceName; + private String attrGroup; + private String attrName; + private String collectTimestamp; + private String metricValue; + private String hostIp; + + public MetricData(String subsystemId, String interfaceName, String attrGroup, String attrName, String collectTime, String metricValue, + String hostIp) { + this.subsystemId = subsystemId; + this.interfaceName = interfaceName; + this.attrGroup = attrGroup; + this.attrName = attrName; + this.collectTimestamp = collectTime; + this.metricValue = metricValue; + this.hostIp = hostIp; + } + + public MetricData() { + + } + + public String getSubsystemId() { + return subsystemId; + } + + public void setSubsystemId(String subsystemId) { + this.subsystemId = subsystemId; 
+ } + + public String getInterfaceName() { + return interfaceName; + } + + public void setInterfaceName(String interfaceName) { + this.interfaceName = interfaceName; + } + + public String getAttrGroup() { + return attrGroup; + } + + public void setAttrGroup(String attrGroup) { + this.attrGroup = attrGroup; + } + + public String getAttrName() { + return attrName; + } + + public void setAttrName(String attrName) { + this.attrName = attrName; + } + + public String getCollectTimestamp() { + return collectTimestamp; + } + + public void setCollectTimestamp(String collectTimestamp) { + this.collectTimestamp = collectTimestamp; + } + + public String getMetricValue() { + return metricValue; + } + + public void setMetricValue(String metricValue) { + this.metricValue = metricValue; + } + + public String getHostIp() { + return hostIp; + } + + public void setHostIp(String hostIp) { + this.hostIp = hostIp; + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/ReportBatchInfo.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/ReportBatchInfo.java new file mode 100644 index 00000000..e270b7b5 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/ReportBatchInfo.java @@ -0,0 +1,32 @@ +package com.webank.wedatasphere.qualitis.entity; + +import java.util.List; + +/** + * @author allenzhou@webank.com + * @date 2021/4/26 20:55 + */ +public class ReportBatchInfo { + private String userAuthKey; + private List metricDataList; + + public ReportBatchInfo() { + // Do nothing. 
+ } + + public String getUserAuthKey() { + return userAuthKey; + } + + public void setUserAuthKey(String userAuthKey) { + this.userAuthKey = userAuthKey; + } + + public List<MetricData> getMetricDataList() { + return metricDataList; + } + + public void setMetricDataList(List<MetricData> metricDataList) { + this.metricDataList = metricDataList; + } +} diff --git a/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/UploadRecord.java b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/UploadRecord.java new file mode 100644 index 00000000..a61cc414 --- /dev/null +++ b/core/alarm/src/main/java/com/webank/wedatasphere/qualitis/entity/UploadRecord.java @@ -0,0 +1,91 @@ +package com.webank.wedatasphere.qualitis.entity; + +import java.util.Date; +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.GeneratedValue; +import javax.persistence.GenerationType; +import javax.persistence.Table; +import javax.persistence.Id; + +/** + * @author allenzhou@webank.com + * @date 2021/10/5 11:10 + */ +@Entity +@Table(name = "qualitis_upload_record") +public class UploadRecord { + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + private Long id; + + @Column(name = "ims_rule_count") + private Integer imsRuleCount; + @Column(name = "upload_status") + private Boolean status; + @Column(name = "upload_date") + private Date uploadDate; + @Column(name = "upload_time") + private String uploadTime; + @Column(name = "upload_err_msg", columnDefinition = "TEXT") + private String errMsg; + + public UploadRecord() { + } + + public UploadRecord(Integer imsRuleCount, Boolean status, Date uploadDate, String uploadTime, String errMsg) { + this.imsRuleCount = imsRuleCount; + this.status = status; + this.uploadDate = uploadDate; + this.uploadTime = uploadTime; + this.errMsg = errMsg; + } + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public Integer getImsRuleCount() { + return imsRuleCount; + } + +
public void setImsRuleCount(Integer imsRuleCount) { + this.imsRuleCount = imsRuleCount; + } + + public Boolean getStatus() { + return status; + } + + public void setStatus(Boolean status) { + this.status = status; + } + + public Date getUploadDate() { + return uploadDate; + } + + public void setUploadDate(Date uploadDate) { + this.uploadDate = uploadDate; + } + + public String getUploadTime() { + return uploadTime; + } + + public void setUploadTime(String uploadTime) { + this.uploadTime = uploadTime; + } + + public String getErrMsg() { + return errMsg; + } + + public void setErrMsg(String errMsg) { + this.errMsg = errMsg; + } +} diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultDao.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultDao.java index fc93adae..4c58c706 100644 --- a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultDao.java +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultDao.java @@ -37,9 +37,22 @@ public interface TaskResultDao { * @param begin * @param end * @param ruleId + * @param ruleMetricId + * @param applicationId + * @return + */ + Double findAvgByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId); + + /** + * Count value from begin time and end time + * @param begin + * @param end + * @param ruleId + * @param ruleMetricId + * @param applicationId * @return */ - Double findAvgByCreateTimeBetweenAndRule(String begin, String end, Long ruleId); + long countByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId); /** * Find task result by application and rule id in @@ -63,16 +76,19 @@ public interface TaskResultDao { * @param size * @return */ - List<Long> findRuleByRuleMetric(Long id, int page, int size); + List<Long> findRuleIdsByRuleMetric(Long id, int page, int size); /** * Find
values by rule ID and rule metric ID. * @param ruleMetricId + * @param startTime + * @param endTime + * @param envName * @param page * @param size * @return */ - List<TaskResult> findValuesByRuleMetric(long ruleMetricId, int page, int size); + List<TaskResult> findValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName, int page, int size); /** * Find avg value by rule ID and rule metric ID. @@ -84,6 +100,17 @@ public interface TaskResultDao { */ Double findAvgByCreateTimeBetweenAndRuleAndRuleMetric(String start, String end, Long ruleId, Long ruleMetricId); + /** + * Count + * @param format + * @param format1 + * @param ruleId + * @param ruleMetricId + * @param applicationId + * @return + */ + long countByCreateTimeBetweenAndRuleAndRuleMetric(String format, String format1, Long ruleId, Long ruleMetricId, String applicationId); + /** * Find value. * @param applicationId @@ -96,14 +123,33 @@ /** * Count values. * @param ruleMetricId + * @param startTime + * @param endTime + * @param envName * @return */ - int countValuesByRuleMetric(long ruleMetricId); + int countValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName); /** * Count rules. * @param ruleMetricId * @return */ - int countRuleByRuleMetric(Long ruleMetricId); + int countRuleIdsByRuleMetric(Long ruleMetricId); + + /** + * Find values with time.
+ * @param ruleMetricId + * @param startTime + * @param endTime + * @return + */ + List<TaskResult> findValuesByRuleMetricWithTime(Long ruleMetricId, String startTime, String endTime); + + /** + * Find all by application ID + * @param applicationId + * @return + */ + List<TaskResult> findByApplicationId(String applicationId); } diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultStatusDao.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultStatusDao.java new file mode 100644 index 00000000..aba9ff40 --- /dev/null +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/TaskResultStatusDao.java @@ -0,0 +1,28 @@ +package com.webank.wedatasphere.qualitis.dao; + +import com.webank.wedatasphere.qualitis.entity.TaskResultStatus; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-11-14 10:01 + * @description + */ +public interface TaskResultStatusDao { + + /** + * Finding by applicationId and status and ruleId + * @param applicationId + * @param ruleId + * @param status + * @return + */ + List<TaskResultStatus> findByStatus(String applicationId, Long ruleId, Integer status); + + /** + * Saving batch data + * @param taskResultStatusList + */ + void saveBatch(List<TaskResultStatus> taskResultStatusList); +} diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultDaoImpl.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultDaoImpl.java index e5df8747..9c839ee6 100644 --- a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultDaoImpl.java +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultDaoImpl.java @@ -41,37 +41,58 @@ public List<TaskResult> findByApplicationAndRule(String applicationId, Long rule } @Override - public Double findAvgByCreateTimeBetweenAndRule(String begin, String end, Long ruleId) { - return resultRepository.findAvgByCreateTimeBetween(begin, end, ruleId); + public List<TaskResult>
findByApplicationIdAndRuleIn(String applicationId, List<Long> ruleIds) { + return resultRepository.findByApplicationIdAndRuleIdIn(applicationId, ruleIds); } @Override - public List<TaskResult> findByApplicationIdAndRuleIn(String applicationId, List<Long> ruleIds) { - return resultRepository.findByApplicationIdAndRuleIdIn(applicationId, ruleIds) + public Double findAvgByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId) { + return resultRepository.findAvgByCreateTimeBetweenAndRuleAndMetricAndApplication(begin, end, ruleId, ruleMetricId, applicationId); } + @Override + public long countByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId) { + return resultRepository.countByCreateTimeBetweenAndRuleAndMetricAndApplication(begin, end, ruleId, ruleMetricId, applicationId); + } + + @Override public TaskResult saveTaskResult(TaskResult taskResult) { return resultRepository.save(taskResult); } @Override - public List<Long> findRuleByRuleMetric(Long ruleMetricId, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + public List<Long> findRuleIdsByRuleMetric(Long ruleMetricId, int page, int size) { + Sort sort = Sort.by(Sort.Direction.ASC, "id"); Pageable pageable = PageRequest.of(page, size, sort); - return resultRepository.findRuleByRuleMetricId(ruleMetricId, pageable).getContent(); + return resultRepository.findRuleIdsByRuleMetric(ruleMetricId, pageable).getContent(); + } + + @Override + public int countRuleIdsByRuleMetric(Long ruleMetricId) { + return resultRepository.countRuleIdsByRuleMetric(ruleMetricId); } @Override - public List<TaskResult> findValuesByRuleMetric(long ruleMetricId, int page, int size) { - Sort sort = new Sort(Sort.Direction.DESC, "id"); + public List<TaskResult> findValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); Pageable
pageable = PageRequest.of(page, size, sort); - return resultRepository.findValuesByRuleAndRuleMetric(ruleMetricId, pageable).getContent(); + return resultRepository.findValuesByRuleMetric(ruleMetricId, startTime, endTime, envName, pageable).getContent(); + } + + @Override + public int countValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName) { + return resultRepository.countValuesByRuleMetric(ruleMetricId, startTime, endTime, envName); } @Override public Double findAvgByCreateTimeBetweenAndRuleAndRuleMetric(String start, String end, Long ruleId, Long ruleMetricId) { - return resultRepository.findAvgByCreateTimeBetween(start, end, ruleId, ruleMetricId); + return resultRepository.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(start, end, ruleId, ruleMetricId); + } + + @Override + public long countByCreateTimeBetweenAndRuleAndRuleMetric(String start, String end, Long ruleId, Long ruleMetricId, String applicationId) { + return resultRepository.countByCreateTimeBetweenAndRuleAndRuleMetric(start, end, ruleId, ruleMetricId, applicationId); } @Override @@ -80,12 +101,13 @@ public TaskResult find(String applicationId, Long ruleId, Long ruleMetricId) { } @Override - public int countValuesByRuleMetric(long ruleMetricId) { - return resultRepository.countValuesByRuleMetric(ruleMetricId); + public List<TaskResult> findValuesByRuleMetricWithTime(Long ruleMetricId, String startTime, String endTime) { + return resultRepository.findValuesByRuleMetricWithTime(ruleMetricId, startTime, endTime); } @Override - public int countRuleByRuleMetric(Long ruleMetricId) { - return resultRepository.countRulesByRuleMetric(ruleMetricId); + public List<TaskResult> findByApplicationId(String applicationId) { + return resultRepository.findByApplicationId(applicationId); } + } diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultStatusDaoImpl.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultStatusDaoImpl.java new
file mode 100644 index 00000000..aadf9abc --- /dev/null +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/TaskResultStatusDaoImpl.java @@ -0,0 +1,31 @@ +package com.webank.wedatasphere.qualitis.dao.impl; + +import com.webank.wedatasphere.qualitis.dao.TaskResultStatusDao; +import com.webank.wedatasphere.qualitis.dao.repository.TaskResultStatusRepository; +import com.webank.wedatasphere.qualitis.entity.TaskResultStatus; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Repository; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-11-14 10:01 + * @description + */ +@Repository +public class TaskResultStatusDaoImpl implements TaskResultStatusDao { + + @Autowired + private TaskResultStatusRepository taskResultStatusRepository; + + @Override + public List<TaskResultStatus> findByStatus(String applicationId, Long ruleId, Integer status) { + return taskResultStatusRepository.findByApplicationIdAndRuleIdAndStatus(applicationId, ruleId, status); + } + + @Override + public void saveBatch(List<TaskResultStatus> taskResultStatusList) { + taskResultStatusRepository.saveAll(taskResultStatusList); + } +} diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultRepository.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultRepository.java index eca120fc..aac41353 100644 --- a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultRepository.java +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultRepository.java @@ -35,16 +35,38 @@ public interface TaskResultRepository extends JpaRepository<TaskResult, Long> { */ List<TaskResult> findByApplicationIdAndRuleId(String applicationId, Long ruleId); + /** + * Find task result by application and rule id + * @param applicationId + * @param ruleIds + * @return + */ + List<TaskResult> findByApplicationIdAndRuleIdIn(String applicationId, List<Long> ruleIds); + + /**
* Find value avg from begin time and end time * @param begin * @param end * @param ruleId + * @param ruleMetricId + * @param applicationId * @return */ - @Query("select avg(value) from TaskResult t where (t.createTime between ?1 and ?2) and t.ruleId = ?3 and t.saveResult = 1") - Double findAvgByCreateTimeBetween(String begin, String end, Long ruleId); + @Query("select avg(value) from TaskResult t where (t.createTime between ?1 and ?2) and t.ruleId = ?3 and (t.ruleMetricId = ?4) and t.applicationId != ?5 and t.saveResult = 1") + Double findAvgByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId); + + /** + * Count value from begin time and end time + * @param begin + * @param end + * @param ruleId + * @param ruleMetricId + * @param applicationId + * @return + */ + @Query("select count(t.id) from TaskResult t where (t.createTime between ?1 and ?2) and t.ruleId = ?3 and (t.ruleMetricId = ?4) and t.applicationId != ?5 and t.saveResult = 1") + long countByCreateTimeBetweenAndRuleAndMetricAndApplication(String begin, String end, Long ruleId, Long ruleMetricId, String applicationId); /** * Find avg value by rule ID and rule metric ID. 
@@ -55,15 +77,19 @@ public interface TaskResultRepository extends JpaRepository<TaskResult, Long> { * @return */ @Query("select avg(value) from TaskResult t where (t.createTime between ?1 and ?2) and t.ruleId = ?3 and (t.ruleMetricId = ?4) and t.saveResult = 1") - Double findAvgByCreateTimeBetween(String begin, String end, Long ruleId, Long ruleMetricId); + Double findAvgByCreateTimeBetweenAndRuleAndRuleMetric(String begin, String end, Long ruleId, Long ruleMetricId); /** - * Find task result by application and rule id + * Count + * @param start + * @param end + * @param ruleId + * @param ruleMetricId * @param applicationId - * @param ruleIds * @return */ - List<TaskResult> findByApplicationIdAndRuleIdIn(String applicationId, List<Long> ruleIds); + @Query("select count(value) from TaskResult t where (t.createTime between ?1 and ?2) and t.ruleId = ?3 and (t.ruleMetricId = ?4) and t.applicationId != ?5 and t.saveResult = 1") + long countByCreateTimeBetweenAndRuleAndRuleMetric(String start, String end, Long ruleId, Long ruleMetricId, String applicationId); /** * Find rule IDs by rule metric ID. @@ -72,19 +98,41 @@ public interface TaskResultRepository extends JpaRepository<TaskResult, Long> { * @return */ @Query(value = "SELECT tr.ruleId from TaskResult tr where tr.ruleMetricId = ?1") - Page<Long> findRuleByRuleMetricId(Long id, Pageable pageable); + Page<Long> findRuleIdsByRuleMetric(Long id, Pageable pageable); + + /** + * Count rules. + * @param ruleMetricId + * @return + */ + @Query(value = "SELECT count(tr.ruleId) from TaskResult tr where tr.ruleMetricId = ?1") + int countRuleIdsByRuleMetric(Long ruleMetricId); /** * Find values by rule ID and rule metric ID.
* @param ruleMetricId + * @param startTime + * @param endTime + * @param envName * @param pageable * @return */ - @Query(value = "SELECT tr from TaskResult tr where tr.ruleMetricId = ?1") - Page<TaskResult> findValuesByRuleAndRuleMetric(long ruleMetricId, Pageable pageable); + @Query(value = "SELECT tr from TaskResult tr where tr.ruleMetricId = ?1 and tr.saveResult = 1 and (tr.createTime between ?2 and ?3) and (?4 is null or LENGTH(?4) = 0 or tr.envName = ?4)") + Page<TaskResult> findValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName, Pageable pageable); /** - * Find value. + * Count values. + * @param ruleMetricId + * @param startTime + * @param endTime + * @param envName + * @return + */ + @Query(value = "SELECT count(tr.id) from TaskResult tr where tr.ruleMetricId = ?1 and tr.saveResult = 1 and (tr.createTime between ?2 and ?3) and (?4 is null or LENGTH(?4) = 0 or tr.envName = ?4)") + int countValuesByRuleMetric(long ruleMetricId, String startTime, String endTime, String envName); + + /** + * Find value for upload task info and file rule. * @param applicationId * @param ruleId * @param ruleMetricId @@ -94,18 +142,20 @@ TaskResult findValue(String applicationId, Long ruleId, Long ruleMetricId); /** - * Count values. + * Find values with time. * @param ruleMetricId + * @param startTime + * @param endTime * @return */ - @Query(value = "SELECT count(tr.id) from TaskResult tr where tr.ruleMetricId = ?1") - int countValuesByRuleMetric(long ruleMetricId); + @Query(value = "SELECT tr from TaskResult tr where tr.ruleMetricId = ?1 and tr.createTime between ?2 and ?3") + List<TaskResult> findValuesByRuleMetricWithTime(Long ruleMetricId, String startTime, String endTime); /** - * Count rules.
- * @param ruleMetricId + * Find all by application ID + * @param applicationId * @return */ - @Query(value = "SELECT count(tr.ruleId) from TaskResult tr where tr.ruleMetricId = ?1") - int countRulesByRuleMetric(Long ruleMetricId); + @Query(value = "SELECT tr from TaskResult tr where tr.applicationId = ?1") + List<TaskResult> findByApplicationId(String applicationId); } diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultStatusRepository.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultStatusRepository.java new file mode 100644 index 00000000..cbe252ad --- /dev/null +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/TaskResultStatusRepository.java @@ -0,0 +1,23 @@ +package com.webank.wedatasphere.qualitis.dao.repository; + +import com.webank.wedatasphere.qualitis.entity.TaskResultStatus; +import org.springframework.data.jpa.repository.JpaRepository; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-11-14 9:59 + * @description + */ +public interface TaskResultStatusRepository extends JpaRepository<TaskResultStatus, Long> { + + /** + * findByApplicationIdAndRuleIdAndStatus + * @param applicationId + * @param ruleId + * @param status + * @return + */ + List<TaskResultStatus> findByApplicationIdAndRuleIdAndStatus(String applicationId, Long ruleId, Integer status); +} diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResult.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResult.java index b17ba9ae..1a3d15a3 100644 --- a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResult.java +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResult.java @@ -46,6 +46,16 @@ public class TaskResult { private Long runDate; @Column(name = "department_code") private String departmentCode; + @Column(name = "version") + private String version; + @Column(name = "env_name") +
private String envName; + + @Column(name = "compare_value") + private String compareValue; + + @Column(name = "denoising_value") + private Boolean denoisingValue; public TaskResult() { // Default Constructor @@ -130,4 +140,36 @@ public String getDepartmentCode() { public void setDepartmentCode(String departmentCode) { this.departmentCode = departmentCode; } + + public String getVersion() { + return version; + } + + public void setVersion(String version) { + this.version = version; + } + + public String getEnvName() { + return envName; + } + + public void setEnvName(String envName) { + this.envName = envName; + } + + public String getCompareValue() { + return compareValue; + } + + public void setCompareValue(String compareValue) { + this.compareValue = compareValue; + } + + public Boolean getDenoisingValue() { + return denoisingValue; + } + + public void setDenoisingValue(Boolean denoisingValue) { + this.denoisingValue = denoisingValue; + } } diff --git a/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResultStatus.java b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResultStatus.java new file mode 100644 index 00000000..42407392 --- /dev/null +++ b/core/analysis/src/main/java/com/webank/wedatasphere/qualitis/entity/TaskResultStatus.java @@ -0,0 +1,80 @@ +package com.webank.wedatasphere.qualitis.entity; + +import javax.persistence.*; + +/** + * @author v_minminghe@webank.com + * @date 2022-11-14 9:51 + * @description + */ +@Entity +@Table(name = "qualitis_application_task_result_status") +public class TaskResultStatus { + + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + private Long id; + + @Column(name = "application_id") + private String applicationId; + + @Column + private Long ruleId; + + @OneToOne + private TaskResult taskResult; + + @Column(name = "task_rule_alarm_config_id") + private Long taskRuleAlarmConfigId; + + @Column + private Integer status; + + public Long getId() { + return id; + } + + 
public void setId(Long id) { + this.id = id; + } + + public String getApplicationId() { + return applicationId; + } + + public TaskResult getTaskResult() { + return taskResult; + } + + public void setTaskResult(TaskResult taskResult) { + this.taskResult = taskResult; + } + + public void setApplicationId(String applicationId) { + this.applicationId = applicationId; + } + + public Long getTaskRuleAlarmConfigId() { + return taskRuleAlarmConfigId; + } + + public void setTaskRuleAlarmConfigId(Long taskRuleAlarmConfigId) { + this.taskRuleAlarmConfigId = taskRuleAlarmConfigId; + } + + public Integer getStatus() { + return status; + } + + public void setStatus(Integer status) { + this.status = status; + } + + public Long getRuleId() { + return ruleId; + } + + public void setRuleId(Long ruleId) { + this.ruleId = ruleId; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/LocalConfig.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/LocalConfig.java index 06cca527..b5f3709c 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/LocalConfig.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/LocalConfig.java @@ -17,8 +17,13 @@ public class LocalConfig { /** * dev or prod. */ - @Value("${front_end.center}") - private String center; + @Value("${front_end.support_migrate}") + private Boolean supportMigrate; + /** + * BDAP or BDP. 
+ */ + @Value("${front_end.cluster}") + private String cluster; public String getLocal() { return local; @@ -28,11 +33,19 @@ public void setLocal(String local) { this.local = local; } - public String getCenter() { - return center; + public Boolean getSupportMigrate() { + return supportMigrate; + } + + public void setSupportMigrate(Boolean supportMigrate) { + this.supportMigrate = supportMigrate; + } + + public String getCluster() { + return cluster; } - public void setCenter(String center) { - this.center = center; + public void setCluster(String cluster) { + this.cluster = cluster; } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/QualitisConstants.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/QualitisConstants.java new file mode 100644 index 00000000..01f6b4e1 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/QualitisConstants.java @@ -0,0 +1,322 @@ +package com.webank.wedatasphere.qualitis.constants; + +import org.apache.commons.lang3.time.FastDateFormat; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.io.BufferedReader; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.net.HttpURLConnection; +import java.net.InetAddress; +import java.net.URL; +import java.net.UnknownHostException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.regex.Matcher; +import java.util.regex.Pattern; + +/** + * @author allenzhou@webank.com + * @date 2022/8/25 17:15 + */ +public class QualitisConstants { + + private static final Logger LOGGER = LoggerFactory.getLogger(QualitisConstants.class); + + /** + * Built-in variables for import and export + */ + public static final String EXECUTE_USER = "rule_execute_user"; + public static final String EXECUTE_CLUSETER = "rule_execute_cluster"; + public static final String WTSS_DEPLOY_USER = "wtss_deploy_user"; + public static final String WTSS_DEPLOY_CLUSETER =
"wtss_deploy_cluster"; + + /** + * File suffix + */ + public static final String SUPPORT_CONFIG_SUFFIX_NAME = ".properties"; + public static final String SUPPORT_SCALA_SUFFIX_NAME = ".scala"; + public static final String SUPPORT_EXCEL_SUFFIX_NAME = ".xlsx"; + public static final String SUPPORT_PYTHON_SUFFIX_NAME = ".py"; + public static final String SUPPORT_JAR_SUFFIX_NAME = ".jar"; + public static final String SUPPORT_ZIP_SUFFIX_NAME = ".zip"; + + /** + * Cluster type + */ + public static final String BDAP = "BDAP"; + public static final String BDP = "BDP"; + + /** + * Branch Master + */ + public static final String MASTER = "master"; + + /** + * Ordinary num + */ + public static final int LENGTH_TWO = 2; + + /** + * Group execution num + */ + public static final int ONLY_ONE_GROUP = 1; + + /** + * Check alert default alert content columns' num + */ + public static final int DEFAULT_CONTENT_COLUMN_LENGTH = 5; + + /** + * Dss node version num + */ + public static final Integer DSS_NODE_VERSION_NUM = 5; + + /** + * Common array index to fix magical value + */ + public static final int COMMON_ARRAY_INDEX_O = 0; + public static final int COMMON_ARRAY_INDEX_1 = 1; + public static final int COMMON_ARRAY_INDEX_2 = 2; + public static final int COMMON_ARRAY_INDEX_3 = 3; + public static final int COMMON_ARRAY_INDEX_4 = 4; + + /** + * Role name + */ + public static final String ADMIN = "ADMIN"; + public static final String PROJECTOR = "PROJECTOR"; + + /** + * Subsystem ID + */ + public static final Integer SUB_SYSTEM_ID = 5375; + + /** + * Key of compared value + */ + public static final String AVG_OF_CURRENT = "avgOfCurrent"; + public static final String AVG_OF_LAST_CYCLE = "avgOfLastCycle"; + + /** + * Rule datasource + */ + public static final Integer ORIGINAL_INDEX = -1; + public static final Integer RIGHT_INDEX = 1; + public static final Integer LEFT_INDEX = 0; + public static final String MAP_TYPE = "map"; + public static final String ARRAY_TYPE = "array"; + public 
static final String STRUCT_TYPE = "struct"; + + /** + * Datasource management. + */ + public static final String UNION_ALL = "All"; + public static final Pattern DATA_SOURCE_ID = Pattern.compile("\\.\\(ID=[0-9]+\\{[0-9,]+\\}\\)"); + public static final Pattern DATA_SOURCE_NAME = Pattern.compile("\\.\\(NAME=[\\u4E00-\\u9FA5A-Za-z0-9_]+\\{[\\u4E00-\\u9FA5A-Za-z0-9_,]+\\}\\)"); + + /** + * Date format + */ + public static final FastDateFormat PRINT_DATE_FORMAT = FastDateFormat.getInstance("yyyy-MM-dd"); + public static final FastDateFormat FILE_DATE_FORMATTER = FastDateFormat.getInstance("yyyyMMddHHmmss"); + public static final FastDateFormat PRINT_DATE_FORMAT_ELIMINATE = FastDateFormat.getInstance("yyyyMMdd"); + public static final FastDateFormat PRINT_TIME_FORMAT = FastDateFormat.getInstance("yyyy-MM-dd HH:mm:ss"); + + /** + * AlarmEventEnum + * Alarm events: + * check success (1): only tasks that passed verification + * check failure (2): tasks that failed verification, were blocked, or failed at the engine level + * execution completed (3): tasks that passed or failed verification + */ + public static final Integer CHECK_SUCCESS = 1; + public static final Integer CHECK_FAILURE = 2; + public static final Integer EXECUTION_COMPLETED = 3; + + /** + * qualitis_template_default_input_meta, single-table IDs in version 0.23.0 + */ + public static final List<Integer> SINGLE_TABLE = Arrays.asList(17, 18, 19, 20, 21, 22, 23, 33); + /** + * qualitis_template_default_input_meta, cross-table IDs in version 0.23.0 + */ + public static final List<Integer> CROSS_TABLE = Arrays.asList(17, 18, 20, 24, 25, 26, 27, 28, 29, 30, 31, 32); + /** + * qualitis_template_default_input_meta, file IDs in version 0.23.0 + */ + public static final List<Integer> FILE_TABLE = Arrays.asList(17, 18, 20); + /** + * Excludes legacy data whose placeholders do not match the input_type of version 0.23.0 + */ + public static final List<Integer> ELIMINATE_PLACEHOLDER = Arrays.asList(1, 7, 10, 20, 21, 22, 23, 25, 36, 37, 38); + + public static final List<String> OVER_TABLE_TYPE = Arrays.asList("11", "12", "13", "14", "30", "31"); + public static final String ROW_DATA_CONSISTENCY_VERIFICATION = "行数据一致性校验"; + + /** + * Engine configuration: name values in the JSON data format + */ + public static final List<String> ENGINE_CONFIGURATION = Arrays.asList("spark引擎资源上限",
"worker资源设置", "spark引擎资源设置", "spark资源设置"); + + /** + * Numeric range (maximum, minimum, intermediate expression) + */ + public static final String INTERMEDIATE_PLACEHOLDER = "intermediate_expression"; + public static final String INTERMEDIATE_EXPRESSION = "{&INTERMEDIATE_EXPRESSION}"; + public static final String INTERMEDIATE_PLACEHOLDER_DESCRIPTION = "{&REPLACE_PLACEHOLDER_IN_SQL}${intermediate_expression}"; + + public static final String MAXIMUM = "{&MAXIMUM}"; + public static final String MAXIMUM_PLACEHOLDER = "maximum"; + public static final String MAXIMUM_PLACEHOLDER_DESCRIPTION = "{&REPLACE_PLACEHOLDER_IN_SQL}${maximum}"; + + public static final String MINIMUM = "{&MINIMUM}"; + public static final String MINIMUM_PLACEHOLDER = "minimum"; + public static final String MINIMUM_PLACEHOLDER_DESCRIPTION = "{&REPLACE_PLACEHOLDER_IN_SQL}${minimum}"; + + /** + * qualitis_template_output_meta: check value (output_name) and English name (output_en_name) + */ + public static final String DISSATISFIED_EN_NAME = "Number of dissatisfied en_name"; + public static final String DISSATISFACTION = "不满足"; + public static final String NUMS = "的数量"; + + /** + * result_type attribute of qualitis_template_statistic_input_meta + */ + public static final String DATA_TYPE_LONG = "Long"; + + /** + * Replacement for * in show_sql + */ + public static final String ASTERISK = "\\*"; + + /** + * Log level + */ + public static final String LOG_INFO = " INFO "; + public static final String LOG_WARN = " WARN "; + public static final String LOG_ERROR = " ERROR "; + + + public static final String FPS_DEFAULT_USER = "hadoop"; + + /** + * @Description: intranet IP of the server + */ + public static String QUALITIS_SERVER_HOST; + + static { + try { + QUALITIS_SERVER_HOST = InetAddress.getLocalHost().getHostAddress(); + } catch (UnknownHostException e) { + LOGGER.error(e.getMessage(), e); + } + } + + /** + * @Description: gets the public IP of the host; this method requires Internet access and does not work on an intranet-only host + **/ + public static String getPublicIp() { + try { + // URL of the HTML page that reports the caller's IP + String path = "http://www.net.cn/static/customercare/yourip.asp"; + // Create the URL object + URL url =
new URL(path); + // Open the connection + HttpURLConnection conn = (HttpURLConnection) url.openConnection(); + // Set the encoding used to read the response + conn.setRequestProperty("contentType", "GBK"); + // Connect timeout + conn.setConnectTimeout(5 * 1000); + // Request method + conn.setRequestMethod("GET"); + InputStream inStream = conn.getInputStream(); + BufferedReader in = new BufferedReader(new InputStreamReader( + inStream, "GBK")); + StringBuffer buffer = new StringBuffer(); + String line = ""; + // Read the fetched content line by line into the buffer + while ((line = in.readLine()) != null) { + buffer.append(line); + } + List<String> ips = new ArrayList<String>(); + + // Extract IP addresses from the string with a regular expression + String regEx = "((2[0-4]\\d|25[0-5]|[01]?\\d\\d?)\\.){3}(2[0-4]\\d|25[0-5]|[01]?\\d\\d?)"; + String str = buffer.toString(); + Pattern p = Pattern.compile(regEx); + Matcher m = p.matcher(str); + while (m.find()) { + String result = m.group(); + ips.add(result); + } + String publicIp = ips.get(0); + + // Return the public IP + return publicIp; + } catch (Exception e) { + LOGGER.error("Connection timed out while getting the public IP"); + return ""; + } + } + + public static final String DEFAULT_NODE_NAME = "qualitis_0000"; + public static final String CHECKALERT_NODE_NAME_PREFIX = "checkalert"; + + /** + * Rule group default filter placeholder + */ + public static final String RULE_GROUP_FILTER_PLACEHOLDER = "${table_value_filter}"; + + public static final Long EXPECT_LINES_NOT_REPEAT_ID = 2149L; + public static final Long EXPECT_DATA_NOT_REPEAT_ID = 4000L; + + /** + * Execution param variables + */ + public static final String QUALITIS_DELETE_FAIL_CHECK_RESULT = "qualitis_delete_fail_check_result"; + public static final String QUALITIS_UPLOAD_RULE_METRIC_VALUE = "qualitis_upload_rule_metric_value"; + public static final String QUALITIS_UPLOAD_ABNORMAL_VALUE = "qualitis_upload_abnormal_value"; + public static final String QUALITIS_ALERT_RECEIVERS = "qualitis_alert_receivers"; + public static final String QUALITIS_CLUSTER_NAME = "qualitis_cluster_name"; + public static final String QUALITIS_ALERT_LEVEL = "qualitis_alert_level"; +
+ public static final String QUALITIS_ENGINE_REUSE = "engine_reuse"; + public static final String QUALITIS_STARTUP_PARAM = "qualitis_startup_param"; + public static final String QUALITIS_ENGINE_TYPE = "qualitis.linkis.engineType"; + public static final String QUALITIS_MID_TABLE_REUSE = "mid_table_reuse"; + public static final String QUALITIS_UNION_ALL_SAVE = "union_all_save"; + + + /** + * Set flag + */ + public static final String SPARK_SET_FLAG = "qualitis.spark.set."; + /** + * Special separator + */ + public static final String AND = "and"; + + /** + * other + */ + public static final int APPLICATION_RANDOM_LENGTH = 6; + + /** + * Datasource management + */ + public static final int DATASOURCE_MANAGER_INPUT_TYPE_MANUAL = 1; + public static final int DATASOURCE_MANAGER_INPUT_TYPE_AUTO = 2; + public static final int DATASOURCE_MANAGER_VERIFY_TYPE_SHARE = 1; + public static final int DATASOURCE_MANAGER_VERIFY_TYPE_NON_SHARE = 2; + + /** + * Authentication type + */ + public static final String AUTH_TYPE_ACCOUNT_PWD = "accountPwd"; + public static final String AUTH_TYPE_DPM = "dpm"; + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ResponseStatusConstants.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ResponseStatusConstants.java new file mode 100644 index 00000000..9b1168a6 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ResponseStatusConstants.java @@ -0,0 +1,16 @@ +package com.webank.wedatasphere.qualitis.constants; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-15 10:03 + * @description + */ +public class ResponseStatusConstants { + + /** + * Unified API response status codes + */ + public static final String OK = "200"; + public static final String SERVER_ERROR = "500"; + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ThirdPartyConstants.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ThirdPartyConstants.java new file mode 100644 
index 00000000..ef7d4bdb --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/ThirdPartyConstants.java @@ -0,0 +1,22 @@ +package com.webank.wedatasphere.qualitis.constants; + +/** + * @author v_gaojiedeng@webank.com + */ +public class ThirdPartyConstants { + + public static final String WTSS_RES_SESSION_ID = "session.id"; + public static final String WTSS_RES_ERROR = "error"; + public static final String WTSS_STATUS_CODE = "200"; + public static final String WTSS_RESPONSE_CODE = "code"; + /** + * 1: front rule group + */ + public static final String WTSS_FRONT_TYPE = "1"; + /** + * 2: back rule group + */ + public static final String WTSS_BACK_TYPE = "2"; + public static final String WTSS_RSA = "RSA"; + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/EventTypeEnum.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/WhiteListTypeEnum.java similarity index 68% rename from core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/EventTypeEnum.java rename to core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/WhiteListTypeEnum.java index 4d35975b..f9800fee 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/EventTypeEnum.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/constants/WhiteListTypeEnum.java @@ -14,22 +14,22 @@ * limitations under the License. 
*/ -package com.webank.wedatasphere.qualitis.project.constant; +package com.webank.wedatasphere.qualitis.constants; /** * @author allenzhou */ -public enum EventTypeEnum { +public enum WhiteListTypeEnum { /** - * Type of project + * White list type */ - MODIFY_PROJECT(1, "Modify Project"), - SUBMIT_PROJECT(2, "Submit Project"),; + CHECK_ALERT_TABLE(1, "Check Alert Table") + ; private Integer code; private String message; - EventTypeEnum(Integer code, String message) { + WhiteListTypeEnum(Integer code, String message) { this.code = code; this.message = message; } @@ -38,15 +38,7 @@ public Integer getCode() { return code; } - public void setCode(Integer code) { - this.code = code; - } - public String getMessage() { return message; } - - public void setMessage(String message) { - this.message = message; - } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/PermissionDeniedRequestException.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/PermissionDeniedRequestException.java index c4d29379..60523d98 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/PermissionDeniedRequestException.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/PermissionDeniedRequestException.java @@ -33,10 +33,14 @@ public PermissionDeniedRequestException(String message, Integer status) { super(message); this.status = status; } - public GeneralResponse getResponse() { + public GeneralResponse getResponse() { return new GeneralResponse<>(this.status + "", getMessage(), null); } + public GeneralResponse getResponse(String message) { + return new GeneralResponse<>(this.status + "", message, null); + } + public Integer getStatus() { return status; } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/UnExpectedRequestException.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/UnExpectedRequestException.java index 
68d65d03..46f06ec8 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/UnExpectedRequestException.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/UnExpectedRequestException.java @@ -34,10 +34,14 @@ public UnExpectedRequestException(String message, Integer status) { this.status = status; } - public GeneralResponse getResponse() { + public GeneralResponse getResponse() { return new GeneralResponse<>(this.status + "", getMessage(), null); } + public GeneralResponse getResponse(String message) { + return new GeneralResponse<>(this.status + "", message, null); + } + public Integer getStatus() { return status; } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/PermissionDeniedUserRequestExceptionMapper.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/PermissionDeniedUserRequestExceptionMapper.java index a27eb2e5..ece47692 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/PermissionDeniedUserRequestExceptionMapper.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/PermissionDeniedUserRequestExceptionMapper.java @@ -16,15 +16,16 @@ package com.webank.wedatasphere.qualitis.exception.mapper; import com.webank.wedatasphere.qualitis.exception.PermissionDeniedRequestException; -import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.parser.LocaleParser; -import javax.ws.rs.core.Response; -import javax.ws.rs.ext.ExceptionMapper; -import javax.ws.rs.ext.Provider; +import org.apache.commons.lang.StringUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; +import javax.ws.rs.core.Response; +import javax.ws.rs.ext.ExceptionMapper; +import javax.ws.rs.ext.Provider; + /** * @author allenzhou * @@ -43,6 +44,6 @@ public class 
PermissionDeniedUserRequestExceptionMapper implements ExceptionMapp public Response toResponse(PermissionDeniedRequestException exception) { String message = localeParser.replacePlaceHolderByLocale(exception.getMessage(), "en_US"); LOGGER.warn(message, exception); - return Response.ok(exception.getResponse()).status(exception.getStatus()).build(); + return Response.ok(StringUtils.isNotEmpty(message) ? exception.getResponse(message) : exception.getResponse()).status(exception.getStatus()).build(); } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/UnExpectedUserRequestExceptionMapper.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/UnExpectedUserRequestExceptionMapper.java index 6feb8998..47b1b4fd 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/UnExpectedUserRequestExceptionMapper.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/exception/mapper/UnExpectedUserRequestExceptionMapper.java @@ -17,6 +17,7 @@ import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.parser.LocaleParser; +import org.apache.commons.lang.StringUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; @@ -42,6 +43,6 @@ public class UnExpectedUserRequestExceptionMapper implements ExceptionMapper getDatabase(String searchKey, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * Get dataset info. + * + * @param dbId + * @param datasetName + * @param page + * @param size + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getDataset(String dbId, String datasetName, int page, int size, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * Get column standard code. 
+ * + * @param dataSetId + * @param fieldName + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getColumnStandard(Long dataSetId, String fieldName, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * Get data standard info from datamap with token authentication. + * + * @param stdCode + * @param source + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getDataStandardDetail(String stdCode, String source, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * get Data Standard Category + * + * @param page + * @param size + * @param loginUser + * @param stdSubName + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getDataStandardCategory(int page, int size, String loginUser, String stdSubName) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + + /** + * get Data Standard Big Category + * + * @param page + * @param size + * @param loginUser + * @param stdSubName + * @param stdBigCategoryName + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getDataStandardBigCategory(int page, int size, String loginUser, String stdSubName, String stdBigCategoryName) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + + /** + * get Data Standard Small Category + * + * @param page + * @param size + * @param loginUser + * @param stdSubName + * @param stdBigCategoryName + * @param smallCategoryName + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getDataStandardSmallCategory(int page, int size, String loginUser, String stdSubName, String stdBigCategoryName, String smallCategoryName) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + + 
/** + * get Data Standard. + * + * @param page + * @param size + * @param loginUser + * @param stdSmallCategoryUrn + * @param stdCnName + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + * @throws URISyntaxException + */ + Map getDataStandard(int page, int size, String loginUser, String stdSmallCategoryUrn, String stdCnName) throws MetaDataAcquireFailedException, UnExpectedRequestException, URISyntaxException; + + /** + * get Standard Code. + * + * @param stdUrn + * @param page + * @param size + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + * @throws URISyntaxException + */ + Map getStandardCode(int page, int size, String loginUser, String stdUrn) throws MetaDataAcquireFailedException, UnExpectedRequestException, URISyntaxException; + + /** + * get Standard Code Table + * + * @param stdCode + * @param page + * @param size + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + Map getStandardCodeTable(int page, int size, String loginUser, String stdCode) throws MetaDataAcquireFailedException, UnExpectedRequestException; + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/LinkisMetaDataManager.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/LinkisMetaDataManager.java new file mode 100644 index 00000000..8b893fe3 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/LinkisMetaDataManager.java @@ -0,0 +1,94 @@ +package com.webank.wedatasphere.qualitis.metadata.client; + +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.metadata.request.LinkisDataSourceEnvRequest; +import 
com.webank.wedatasphere.qualitis.metadata.request.LinkisDataSourceRequest; +import com.webank.wedatasphere.qualitis.metadata.request.ModifyDataSourceParameterRequest; +import com.webank.wedatasphere.qualitis.metadata.response.datasource.LinkisDataSourceParamsResponse; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-12 10:59 + * @description + */ +public interface LinkisMetaDataManager { + + /** + * create Data Source + * + * @param linkisDataSourceRequest + * @param cluster + * @param authUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + Long createDataSource(LinkisDataSourceRequest linkisDataSourceRequest, String cluster, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * modify Data Source + * + * @param linkisDataSourceRequest + * @param cluster + * @param authUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + Long modifyDataSource(LinkisDataSourceRequest linkisDataSourceRequest, String cluster, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * create Data Source Env + * + * @param inputType + * @param verifyType + * @param linkisDataSourceEnvRequestList + * @param clusterName + * @param authUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + List createDataSourceEnv(Integer inputType, Integer verifyType, List linkisDataSourceEnvRequestList, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * modify Data Source Env + * + * @param inputType + * @param verifyType + * @param linkisDataSourceEnvRequestList + * @param clusterName + * @param authUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + List modifyDataSourceEnv(Integer inputType, Integer verifyType, List 
linkisDataSourceEnvRequestList, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * modify Data Source Params + * + * @param modifyDataSourceParameterRequest + * @param clusterName + * @param authUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + LinkisDataSourceParamsResponse modifyDataSourceParams(ModifyDataSourceParameterRequest modifyDataSourceParameterRequest, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * delete Data Source + * + * @param linkisDataSourceId + * @param clusterName + * @param userName + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + void deleteDataSource(Long linkisDataSourceId, String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException; + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/MetaDataClient.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/MetaDataClient.java index 928525a7..cb191ae3 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/MetaDataClient.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/MetaDataClient.java @@ -17,27 +17,32 @@ package com.webank.wedatasphere.qualitis.metadata.client; -import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.fasterxml.jackson.core.JsonProcessingException; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.metadata.request.GetClusterByUserRequest; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.metadata.response.datasource.LinkisDataSourceInfoDetail; import 
com.webank.wedatasphere.qualitis.metadata.request.GetColumnByUserAndTableRequest; import com.webank.wedatasphere.qualitis.metadata.request.GetDbByUserAndClusterRequest; import com.webank.wedatasphere.qualitis.metadata.request.GetTableByUserAndDbRequest; -import com.webank.wedatasphere.qualitis.metadata.request.GetUserColumnByCsRequest; -import com.webank.wedatasphere.qualitis.metadata.request.GetUserTableByCsIdRequest; -import com.webank.wedatasphere.qualitis.metadata.response.DataInfo; import com.webank.wedatasphere.qualitis.metadata.response.cluster.ClusterInfoDetail; +import com.webank.wedatasphere.qualitis.metadata.request.GetUserTableByCsIdRequest; +import com.webank.wedatasphere.qualitis.metadata.request.GetUserColumnByCsRequest; import com.webank.wedatasphere.qualitis.metadata.response.column.ColumnInfoDetail; +import com.webank.wedatasphere.qualitis.metadata.response.DataInfo; import com.webank.wedatasphere.qualitis.metadata.response.db.DbInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.table.CsTableInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.table.PartitionStatisticsInfo; -import com.webank.wedatasphere.qualitis.metadata.response.table.TableInfoDetail; -import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.metadata.response.table.TableStatisticsInfo; +import com.webank.wedatasphere.qualitis.metadata.response.table.TableInfoDetail; import com.webank.wedatasphere.qualitis.response.GeneralResponse; +import org.json.JSONException; +import java.io.File; + +import java.io.IOException; import java.io.UnsupportedEncodingException; import java.util.List; import java.util.Map; -import org.springframework.web.client.RestClientException; /** * @author howeye @@ -89,11 +94,10 @@ String getTableBasicInfo(String clusterName, String dbName, String tableName, St * Get table by context service ID and DSS node name * @param request * @return - * @throws 
MetaDataAcquireFailedException - * @throws UnExpectedRequestException + * @throws Exception */ DataInfo getTableByCsId(GetUserTableByCsIdRequest request) - throws MetaDataAcquireFailedException, UnExpectedRequestException; + throws Exception; /** * Get column by user and table @@ -112,21 +116,19 @@ DataInfo getColumnByUserAndTable(GetColumnByUserAndTableReques * @param tableName * @param userName * @return - * @throws MetaDataAcquireFailedException - * @throws UnExpectedRequestException + * @throws Exception */ List getColumnInfo(String clusterName, String dbName, String tableName, String userName) - throws MetaDataAcquireFailedException, UnExpectedRequestException; + throws Exception; /** * Get column by context service ID and table context key * @param request * @return - * @throws MetaDataAcquireFailedException - * @throws UnExpectedRequestException + * @throws Exception */ DataInfo getColumnByCsId(GetUserColumnByCsRequest request) - throws MetaDataAcquireFailedException, UnExpectedRequestException; + throws Exception; /** * Get table statistics info. @@ -135,12 +137,10 @@ DataInfo getColumnByCsId(GetUserColumnByCsRequest request) * @param tableName * @param user * @return - * @throws UnExpectedRequestException - * @throws MetaDataAcquireFailedException - * @throws RestClientException + * @throws Exception */ TableStatisticsInfo getTableStatisticsInfo(String clusterName, String dbName, String tableName, String user) - throws UnExpectedRequestException, MetaDataAcquireFailedException, RestClientException; + throws Exception; /** * Get partition statistics info. 
@@ -150,12 +150,10 @@ TableStatisticsInfo getTableStatisticsInfo(String clusterName, String dbName, St * @param partitionPath * @param user * @return - * @throws UnExpectedRequestException - * @throws MetaDataAcquireFailedException - * @throws RestClientException + * @throws Exception */ PartitionStatisticsInfo getPartitionStatisticsInfo(String clusterName, String dbName, String tableName, String partitionPath, String user) - throws UnExpectedRequestException, MetaDataAcquireFailedException, RestClientException; + throws Exception; /** * Check field. @@ -174,7 +172,7 @@ PartitionStatisticsInfo getPartitionStatisticsInfo(String clusterName, String db * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse getAllDataSourceTypes(String clusterName, String userName) + GeneralResponse> getAllDataSourceTypes(String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** @@ -185,7 +183,45 @@ GeneralResponse getAllDataSourceTypes(String clusterName, String userName) * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse getDataSourceEnv(String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> getDataSourceEnv(String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Create data source env. + * @param clusterName + * @param authUser + * @param createSystem + * @param datasourceEnvs + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + * @throws JSONException + */ + GeneralResponse> createDataSourceEnvBatch(String clusterName, String authUser, String createSystem, String datasourceEnvs) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; + + /** + * Modify data source env. 
+ * @param clusterName + * @param authUser + * @param createSystem + * @param datasourceEnvs + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + * @throws JSONException + */ + GeneralResponse> modifyDataSourceEnvBatch(String clusterName, String authUser, String createSystem, String datasourceEnvs) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; + + /** + * Get datasource env by id. + * @param clusterName + * @param authUser + * @param envId + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + * @throws JsonProcessingException + */ + GeneralResponse> getDatasourceEnvById(String clusterName, String authUser, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException, JsonProcessingException; /** * Get data source info pageable. @@ -200,9 +236,23 @@ GeneralResponse getAllDataSourceTypes(String clusterName, String userName) * @throws MetaDataAcquireFailedException * @throws UnsupportedEncodingException */ - GeneralResponse getDataSourceInfoPage(String clusterName, String userName, int page, int size, String searchName, Long typeId) + GeneralResponse> getDataSourceInfoPage(String clusterName, String userName, int page, int size, String searchName, Long typeId) throws UnExpectedRequestException, MetaDataAcquireFailedException, UnsupportedEncodingException; + /** + * Get data source info by ids + * @param clusterName + * @param userName + * @param dataSourceIds + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + * @throws UnsupportedEncodingException + * @throws IOException + */ + GeneralResponse> getDataSourceInfoByIds(String clusterName, String userName, List dataSourceIds) + throws UnExpectedRequestException, MetaDataAcquireFailedException, IOException; + /** * Get data source versions. 
* @param clusterName @@ -212,7 +262,7 @@ GeneralResponse getDataSourceInfoPage(String clusterName, String userName, * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse getDataSourceVersions(String clusterName, String userName, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> getDataSourceVersions(String clusterName, String userName, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Get data source info detail. @@ -221,10 +271,9 @@ GeneralResponse getDataSourceInfoPage(String clusterName, String userName, * @param dataSourceId * @param versionId * @return - * @throws UnExpectedRequestException - * @throws MetaDataAcquireFailedException + * @throws Exception */ - GeneralResponse getDataSourceInfoDetail(String clusterName, String userName, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> getDataSourceInfoDetail(String clusterName, String userName, Long dataSourceId, Long versionId) throws Exception; /** * Get data source info detail by name. 
@@ -235,7 +284,7 @@ GeneralResponse getDataSourceInfoPage(String clusterName, String userName, * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse getDataSourceInfoDetailByName(String clusterName, String authUser, + GeneralResponse> getDataSourceInfoDetailByName(String clusterName, String authUser, String dataSourceName) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** @@ -247,7 +296,7 @@ GeneralResponse getDataSourceInfoDetailByName(String clusterName, String au * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse getDataSourceKeyDefine(String clusterName, String userName, Long keyId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> getDataSourceKeyDefine(String clusterName, String userName, Long keyId) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Connect to data source. @@ -257,8 +306,9 @@ GeneralResponse getDataSourceInfoDetailByName(String clusterName, String au * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException + * @throws JSONException */ - GeneralResponse connectDataSource(String clusterName, String userName, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> connectDataSource(String clusterName, String userName, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; /** * Get connect params. 
@@ -267,11 +317,10 @@ GeneralResponse getDataSourceInfoDetailByName(String clusterName, String au * @param dataSourceId * @param versionId * @return - * @throws UnExpectedRequestException - * @throws MetaDataAcquireFailedException + * @throws Exception */ - GeneralResponse getDataSourceConnectParams(String clusterName, String authUser, Long dataSourceId, - Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> getDataSourceConnectParams(String clusterName, String authUser, Long dataSourceId, + Long versionId) throws Exception; /** * Publish data source. @@ -283,7 +332,7 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse publishDataSource(String clusterName, String userName, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> publishDataSource(String clusterName, String userName, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Expire data source. @@ -294,7 +343,7 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - GeneralResponse expireDataSource(String clusterName, String userName, Long dataSourceId)throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> expireDataSource(String clusterName, String userName, Long dataSourceId)throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Modify data source. 
@@ -305,8 +354,9 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException + * @throws JSONException */ - GeneralResponse modifyDataSource(String clusterName, String userName, Long dataSourceId, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> modifyDataSource(String clusterName, String userName, Long dataSourceId, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; /** * Modify data source param. @@ -317,8 +367,9 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException + * @throws JSONException */ - GeneralResponse modifyDataSourceParam(String clusterName, String userName, Long dataSourceId, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> modifyDataSourceParam(String clusterName, String userName, Long dataSourceId, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; /** * Create data source param. @@ -328,11 +379,12 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException + * @throws JSONException */ - GeneralResponse createDataSource(String clusterName, String userName, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> createDataSource(String clusterName, String userName, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException; /** - * Get db by data source. 
+ * delete data source * @param clusterName * @param userName * @param dataSourceId @@ -340,19 +392,32 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - Map getDbsByDataSource(String clusterName, String userName, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + GeneralResponse> deleteDataSource(String clusterName, String userName, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Get db by data source. + * @param clusterName + * @param userName + * @param dataSourceName + * @param envId + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + Map getDbsByDataSourceName(String clusterName, String userName, String dataSourceName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Get table by data source. * @param clusterName * @param userName - * @param dataSourceId + * @param dataSourceName * @param dbName + * @param envId * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - Map getTablesByDataSource(String clusterName, String userName, Long dataSourceId, String dbName) throws UnExpectedRequestException, MetaDataAcquireFailedException; + Map getTablesByDataSourceName(String clusterName, String userName, String dataSourceName, String dbName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException; /** * Get column by data source. @@ -362,8 +427,234 @@ GeneralResponse getDataSourceConnectParams(String clusterName, String authU * @param dbName * @param tableName * @return + * @throws Exception + */ + @Deprecated + DataInfo getColumnsByDataSource(String clusterName, String userName, Long dataSourceId, String dbName, String tableName) throws Exception; + + /** + * Get column by data source. 
+ * @param clusterName + * @param userName + * @param dataSourceName + * @param dbName + * @param tableName + * @param envId + * @return + * @throws Exception + */ + DataInfo getColumnsByDataSourceName(String clusterName, String userName, String dataSourceName, String dbName, String tableName, Long envId) throws Exception; + + /** + * Get undone task total. + * @param clusterName + * @param executionUser + * @return * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ - DataInfo getColumnsByDataSource(String clusterName, String userName, Long dataSourceId, String dbName, String tableName) throws UnExpectedRequestException, MetaDataAcquireFailedException; + int getUndoneTaskTotal(String clusterName, String executionUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Get data source info by id. + * @param clusterName + * @param userName + * @param dataSourceId + * @return + * @throws Exception + */ + LinkisDataSourceInfoDetail getDataSourceInfoById(String clusterName, String userName, Long dataSourceId) throws Exception; + + /** + * Add udf + * @param currentCluster + * @param username + * @param requestBody + * @return + * @throws UnExpectedRequestException + * @throws IOException + * @throws JSONException + * @throws MetaDataAcquireFailedException + */ + Long addUdf(String currentCluster, String username, Map requestBody) + throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException; + + /** + * Modify udf + * @param currentCluster + * @param linkisUdfAdminUser + * @param requestBody + * @throws UnExpectedRequestException + * @throws IOException + * @throws JSONException + * @throws MetaDataAcquireFailedException + */ + void modifyUdf(String currentCluster, String linkisUdfAdminUser, Map requestBody) throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException; + + /** + * Check file path exists + * @param currentCluster + *
@param userName + * @param uploadFile + * @param needUpload + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + * @throws IOException + * @throws JSONException + * @return + */ + String checkFilePathExistsAndUploadToWorkspace(String currentCluster, String userName, File uploadFile, Boolean needUpload) throws UnExpectedRequestException, MetaDataAcquireFailedException, IOException, JSONException; + + /** + * Client add + * @param currentCluster + * @param targetFilePath + * @param uploadFile + * @param fileName + * @param udfDesc + * @param udfName + * @param returnType + * @param enter + * @param registerName + * @param status + * @param dir + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + * @throws JSONException + * @throws IOException + */ + Long clientAdd(String currentCluster, String targetFilePath, File uploadFile, String fileName, String udfDesc, String udfName, String returnType + , String enter, String registerName, Boolean status, + String dir) throws MetaDataAcquireFailedException, UnExpectedRequestException, JSONException, IOException; + + /** + * Client modify + * @param targetFilePath + * @param uploadFile + * @param currentCluster + * @param clusterIdMaps + * @param fileName + * @param udfDesc + * @param udfName + * @param returnType + * @param enter + * @param registerName + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + * @throws JSONException + * @throws IOException + */ + void clientModify(String targetFilePath, File uploadFile, String currentCluster, Map clusterIdMaps, String fileName, String udfDesc + , String udfName, String returnType, String enter, + String registerName) throws MetaDataAcquireFailedException, UnExpectedRequestException, JSONException, IOException; + + /** + * Share and deploy + * @param udfId + * @param currentCluster + * @param proxyUserNames + * @param udfName + */ + void shareAndDeploy(Long udfId, String
currentCluster, List proxyUserNames, String udfName); + + /** + * Get udf detail + * @param clusterName + * @param linkisUdfAdminUser + * @param linkisUdfId + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + Map getUdfDetail(String clusterName, String linkisUdfAdminUser, Long linkisUdfId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Get directory + * @param category + * @param clusterName + * @param linkisUdfAdminUser + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + List getDirectory(String category, String clusterName, String linkisUdfAdminUser) throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Share udf to proxy users + * @param currentCluster + * @param linkisUdfAdminUser + * @param proxyUserNames + * @param udfId + * @throws IOException + * @throws JSONException + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + void shareUdfToProxyUsers(String currentCluster, String linkisUdfAdminUser, List proxyUserNames, Long udfId) throws IOException, JSONException, UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Delete udf + * @param enableClusterName + * @param linkisUdfId + * @param linkisUdfAdminUser + * @param fileName + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + void deleteUdf(String enableClusterName, Long linkisUdfId, String linkisUdfAdminUser, String fileName) + throws UnExpectedRequestException, MetaDataAcquireFailedException; + + /** + * Switch udf status + * @param enableClusterName + * @param linkisUdfId + * @param linkisUdfAdminUser + * @param isLoad + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + void switchUdfStatus(String enableClusterName, Long linkisUdfId, String linkisUdfAdminUser, Boolean isLoad) + throws UnExpectedRequestException, 
MetaDataAcquireFailedException; + + /** + * Get new version + * @param currentCluster + * @param linkisUdfAdminUser + * @param name + * @return + * @throws UnExpectedRequestException + * @throws IOException + * @throws JSONException + * @throws MetaDataAcquireFailedException + */ + String getUdfNewVersion(String currentCluster, String linkisUdfAdminUser, String name) + throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException; + + /** + * Deploy new version + * @param currentCluster + * @param linkisUdfAdminUser + * @param udfId + * @param version + * @throws UnExpectedRequestException + * @throws IOException + * @throws JSONException + * @throws MetaDataAcquireFailedException + */ + void deployUdfNewVersion(String currentCluster, String linkisUdfAdminUser, Long udfId, String version) + throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException; + + /** + * Delete env. + * @param clusterName + * @param userName + * @param envId + * @return + * @throws UnExpectedRequestException + * @throws MetaDataAcquireFailedException + */ + GeneralResponse<Map<String, Object>> deleteEnv(String clusterName, String userName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException; + } \ No newline at end of file diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/OperateCiService.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/OperateCiService.java new file mode 100644 index 00000000..180ceef7 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/OperateCiService.java @@ -0,0 +1,59 @@ +package com.webank.wedatasphere.qualitis.metadata.client; + +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.response.DcnResponse; +import com.webank.wedatasphere.qualitis.metadata.response.CmdbDepartmentResponse; +import
com.webank.wedatasphere.qualitis.metadata.response.DepartmentSubResponse; +import com.webank.wedatasphere.qualitis.metadata.response.ProductResponse; +import com.webank.wedatasphere.qualitis.metadata.response.SubSystemResponse; + +import com.webank.wedatasphere.qualitis.response.GeneralResponse; + +import java.util.List; + +/** + * @author allenzhou@webank.com + * @date 2021/3/2 10:53 + */ +public interface OperateCiService { + /** + * Get all sub_system info from http of cmdb, include: id, name, full English name. + * + * @return + * @throws UnExpectedRequestException + */ + List getAllSubSystemInfo() throws UnExpectedRequestException; + + /** + * Get all product info from http of cmdb, include: id, cn name. + * + * @return + * @throws UnExpectedRequestException + */ + List getAllProductInfo() throws UnExpectedRequestException; + + /** + * Get all department info from http of cmdb, include: department name. + * + * @return + * @throws UnExpectedRequestException + */ + List getAllDepartmetInfo() throws UnExpectedRequestException; + + /** + * Get dev and ops info + * + * @param deptCode + * @return + * @throws UnExpectedRequestException + */ + List getDevAndOpsInfo(Integer deptCode) throws UnExpectedRequestException; + + /** + * Get dcn + * @param subSystemId + * @return + * @throws UnExpectedRequestException + */ + GeneralResponse getDcn(Long subSystemId) throws UnExpectedRequestException; +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/RuleClient.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/RuleClient.java new file mode 100644 index 00000000..610c6a02 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/client/RuleClient.java @@ -0,0 +1,52 @@ +package com.webank.wedatasphere.qualitis.metadata.client; + +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import
com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.metadata.response.DataInfo; +import com.webank.wedatasphere.qualitis.metadata.response.table.TableMetadataInfo; +import com.webank.wedatasphere.qualitis.metadata.response.table.TableTagInfo; + +/** + * @author v_minminghe@webank.com + * @date 2022-05-31 9:41 + * @description + */ +public interface RuleClient { + + /** + * get tag of table + * @param sourceType + * @param clusterType + * @param dbName + * @param tableName + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + TableTagInfo getTableTag(String sourceType, String clusterType, String dbName, String tableName, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * get MetaData by table + * @param sourceType + * @param clusterType + * @param dbName + * @param tableName + * @param loginUser + * @return + * @throws MetaDataAcquireFailedException + * @throws UnExpectedRequestException + */ + TableMetadataInfo getMetaData(String sourceType, String clusterType, String dbName, String tableName, String loginUser) throws MetaDataAcquireFailedException, UnExpectedRequestException; + + /** + * get tag list + * @param loginUser + * @param page + * @param size + * @return + * @throws MetaDataAcquireFailedException + */ + DataInfo getTagList(String loginUser, int page, int size) throws MetaDataAcquireFailedException, UnExpectedRequestException; + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/exception/MetaDataAcquireFailedException.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/exception/MetaDataAcquireFailedException.java index a9b14ff3..c18be4ba 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/exception/MetaDataAcquireFailedException.java +++ 
b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/exception/MetaDataAcquireFailedException.java @@ -43,7 +43,7 @@ public void setStatus(Integer status) { this.status = status; } - public GeneralResponse getResponse() { + public GeneralResponse getResponse() { return new GeneralResponse<>(this.status + "", getMessage(), null); } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisConnectParamsRequest.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisConnectParamsRequest.java new file mode 100644 index 00000000..d41f2e42 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisConnectParamsRequest.java @@ -0,0 +1,126 @@ +package com.webank.wedatasphere.qualitis.metadata.request; + +import com.fasterxml.jackson.annotation.JsonProperty; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-12 11:12 + * @description + */ +public class LinkisConnectParamsRequest { + private String username; + private String password; + private String host; + private String port; + /** + * Connection parameters + */ + private String connectParam; + private String appId; + private String authType; + private String objectId; + private String mkPrivate; + private String dk; + @JsonProperty(value = "timestamp") + private String timeStamp; + + public String getTimeStamp() { + return timeStamp; + } + + public void setTimeStamp(String timeStamp) { + this.timeStamp = timeStamp; + } + + private List envIdArray; + + public String getDk() { + return dk; + } + + public void setDk(String dk) { + this.dk = dk; + } + + public String getConnectParam() { + return connectParam; + } + + public void setConnectParam(String connectParam) { + this.connectParam = connectParam; + } + + public String getHost() { + return host; + } + + public void setHost(String host) { + this.host = host; + } + + public String getPort() { + return port; + } + + public
void setPort(String port) { + this.port = port; + } + + public List getEnvIdArray() { + return envIdArray; + } + + public void setEnvIdArray(List envIdArray) { + this.envIdArray = envIdArray; + } + + public String getAuthType() { + return authType; + } + + public void setAuthType(String authType) { + this.authType = authType; + } + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } + + public String getPassword() { + return password; + } + + public void setPassword(String password) { + this.password = password; + } + + public String getAppId() { + return appId; + } + + public void setAppId(String appId) { + this.appId = appId; + } + + public String getObjectId() { + return objectId; + } + + public void setObjectId(String objectId) { + this.objectId = objectId; + } + + public String getMkPrivate() { + return mkPrivate; + } + + public void setMkPrivate(String mkPrivate) { + this.mkPrivate = mkPrivate; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceEnvRequest.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceEnvRequest.java new file mode 100644 index 00000000..bf203330 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceEnvRequest.java @@ -0,0 +1,79 @@ +package com.webank.wedatasphere.qualitis.metadata.request; + +import com.fasterxml.jackson.annotation.JsonIgnore; + +import java.util.HashMap; +import java.util.Map; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-12 11:00 + * @description + */ +public class LinkisDataSourceEnvRequest { + + private Long id; + private String envName; + private String envDesc; + private Long dataSourceTypeId; + private String database; + private Map connectParams = new HashMap<>(); + @JsonIgnore + private LinkisConnectParamsRequest connectParamsRequest; + + public String 
getDatabase() { + return database; + } + + public void setDatabase(String database) { + this.database = database; + } + + public Long getDataSourceTypeId() { + return dataSourceTypeId; + } + + public void setDataSourceTypeId(Long dataSourceTypeId) { + this.dataSourceTypeId = dataSourceTypeId; + } + + public String getEnvName() { + return envName; + } + + public void setEnvName(String envName) { + this.envName = envName; + } + + public String getEnvDesc() { + return envDesc; + } + + public void setEnvDesc(String envDesc) { + this.envDesc = envDesc; + } + + public Map getConnectParams() { + return connectParams; + } + + public void setConnectParams(Map connectParams) { + this.connectParams = connectParams; + } + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public LinkisConnectParamsRequest getConnectParamsRequest() { + return connectParamsRequest; + } + + public void setConnectParamsRequest(LinkisConnectParamsRequest connectParamsRequest) { + this.connectParamsRequest = connectParamsRequest; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceRequest.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceRequest.java new file mode 100644 index 00000000..9bccef2c --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/LinkisDataSourceRequest.java @@ -0,0 +1,120 @@ +package com.webank.wedatasphere.qualitis.metadata.request; + +import com.fasterxml.jackson.annotation.JsonIgnore; + +import java.util.HashMap; +import java.util.Map; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-12 11:00 + * @description + */ +public class LinkisDataSourceRequest { + + @JsonIgnore + private String subSystem; + @JsonIgnore + private Long linkisDataSourceId; + @JsonIgnore + private Integer inputType; + @JsonIgnore + private Integer verifyType; + private Long dataSourceTypeId; + private 
String dataSourceName; + private String dataSourceDesc; + private String labels; + private String createSystem = "Qualitis"; + private Map connectParams = new HashMap<>(); + @JsonIgnore + private LinkisConnectParamsRequest sharedConnectParams; + + public String getSubSystem() { + return subSystem; + } + + public void setSubSystem(String subSystem) { + this.subSystem = subSystem; + } + + public Long getLinkisDataSourceId() { + return linkisDataSourceId; + } + + public void setLinkisDataSourceId(Long linkisDataSourceId) { + this.linkisDataSourceId = linkisDataSourceId; + } + + public Map getConnectParams() { + return connectParams; + } + + public void setConnectParams(Map connectParams) { + this.connectParams = connectParams; + } + + public LinkisConnectParamsRequest getSharedConnectParams() { + return sharedConnectParams; + } + + public void setSharedConnectParams(LinkisConnectParamsRequest sharedConnectParams) { + this.sharedConnectParams = sharedConnectParams; + } + + public Long getDataSourceTypeId() { + return dataSourceTypeId; + } + + public void setDataSourceTypeId(Long dataSourceTypeId) { + this.dataSourceTypeId = dataSourceTypeId; + } + + public String getDataSourceName() { + return dataSourceName; + } + + public void setDataSourceName(String dataSourceName) { + this.dataSourceName = dataSourceName; + } + + public String getDataSourceDesc() { + return dataSourceDesc; + } + + public void setDataSourceDesc(String dataSourceDesc) { + this.dataSourceDesc = dataSourceDesc; + } + + public String getLabels() { + return labels; + } + + public void setLabels(String labels) { + this.labels = labels; + } + + public Integer getInputType() { + return inputType; + } + + public void setInputType(Integer inputType) { + this.inputType = inputType; + } + + public Integer getVerifyType() { + return verifyType; + } + + public void setVerifyType(Integer verifyType) { + this.verifyType = verifyType; + } + + public String getCreateSystem() { + return createSystem; + } + + public 
void setCreateSystem(String createSystem) { + this.createSystem = createSystem; + } + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/ModifyDataSourceParameterRequest.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/ModifyDataSourceParameterRequest.java new file mode 100644 index 00000000..57f173c6 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/request/ModifyDataSourceParameterRequest.java @@ -0,0 +1,50 @@ +package com.webank.wedatasphere.qualitis.metadata.request; + +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; + +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-15 9:31 + * @description + */ +public class ModifyDataSourceParameterRequest { + + private Long linkisDataSourceId; + private String comment; + private Map connectParams; + + public Long getLinkisDataSourceId() { + return linkisDataSourceId; + } + + public void setLinkisDataSourceId(Long linkisDataSourceId) { + this.linkisDataSourceId = linkisDataSourceId; + } + + public String getComment() { + return comment; + } + + public void setComment(String comment) { + this.comment = comment; + } + + public Map getConnectParams() { + return connectParams; + } + + public void setConnectParams(Map connectParams) { + this.connectParams = connectParams; + } + + public void setEnvIdArray(List envIdArray) throws UnExpectedRequestException { + if (null == connectParams) { + throw new UnExpectedRequestException("Field {connectParams} is null"); + } + connectParams.put("envIdArray", envIdArray); + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/CmdbDepartmentResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/CmdbDepartmentResponse.java new file mode 100644 index 00000000..77e79f3d --- /dev/null +++ 
b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/CmdbDepartmentResponse.java @@ -0,0 +1,41 @@ +package com.webank.wedatasphere.qualitis.metadata.response; + +import com.fasterxml.jackson.annotation.JsonProperty; + +/** + * @author v_minminghe@webank.com + * @date 2022-09-20 16:13 + * @description + */ +public class CmdbDepartmentResponse { + + @JsonProperty("department_code") + private String code; + @JsonProperty("department_name") + private String name; + private String disable; + + public String getDisable() { + return disable; + } + + public void setDisable(String disable) { + this.disable = disable; + } + + public String getCode() { + return code; + } + + public void setCode(String code) { + this.code = code; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataInfo.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataInfo.java index 5446069d..55c3541e 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataInfo.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataInfo.java @@ -16,6 +16,8 @@ package com.webank.wedatasphere.qualitis.metadata.response; +import com.fasterxml.jackson.annotation.JsonProperty; + import java.util.List; /** @@ -23,12 +25,22 @@ */ public class DataInfo { private int totalCount; + @JsonProperty("env_names") + private List envNames; private List content; public DataInfo() { // Default Constructor } + public List getEnvNames() { + return envNames; + } + + public void setEnvNames(List envNames) { + this.envNames = envNames; + } + public DataInfo(int total) { this.totalCount = total; } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataMapResultInfo.java 
b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataMapResultInfo.java new file mode 100644 index 00000000..ca6ad8bf --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DataMapResultInfo.java @@ -0,0 +1,47 @@ +package com.webank.wedatasphere.qualitis.metadata.response; + +/** + * @author v_minminghe@webank.com + * @date 2022-06-01 15:19 + * @description + */ +public class DataMapResultInfo<T> { + + private String code; + private String msg; + private T data; + + public DataMapResultInfo() { + + } + + public DataMapResultInfo(String code, String msg, T data) { + this.code = code; + this.msg = msg; + this.data = data; + } + + public String getCode() { + return code; + } + + public void setCode(String code) { + this.code = code; + } + + public String getMsg() { + return msg; + } + + public void setMsg(String msg) { + this.msg = msg; + } + + public T getData() { + return data; + } + + public void setData(T data) { + this.data = data; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DcnResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DcnResponse.java new file mode 100644 index 00000000..2e0f9017 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DcnResponse.java @@ -0,0 +1,48 @@ +package com.webank.wedatasphere.qualitis.metadata.response; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.qualitis.util.map.CustomObjectMapper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.List; +import java.util.Map; +import java.util.Objects; +import java.util.stream.Collectors; + +/** + * @author allenzhou@webank.com + * @date 2023/4/25 14:45 + */ +public class DcnResponse { + + private final Logger LOGGER = LoggerFactory.getLogger(DcnResponse.class); + + Map<String, Map<String, List<Map<String, String>>>> res = Maps.newHashMap(); + + public DcnResponse() { + //
default + } + + public DcnResponse(List<Map<String, String>> maps) { + this.res = maps.stream().filter(map -> { + boolean legalDcn = Objects.nonNull(map.get("idc")) && Objects.nonNull(map.get("logic_dcn")); + if (!legalDcn) { + LOGGER.warn("idc or logic dcn is null, data: {}", CustomObjectMapper.transObjectToJson(map)); + } + return legalDcn; + }) + .collect(Collectors.groupingBy( + map -> map.get("idc"), + Collectors.groupingBy(map -> map.get("logic_dcn") + ))); + } + + public Map<String, Map<String, List<Map<String, String>>>> getRes() { + return res; + } + + public void setRes(Map<String, Map<String, List<Map<String, String>>>> res) { + this.res = res; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DepartmentSubResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DepartmentSubResponse.java new file mode 100644 index 00000000..2d1810fe --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/DepartmentSubResponse.java @@ -0,0 +1,32 @@ +package com.webank.wedatasphere.qualitis.metadata.response; + +import com.fasterxml.jackson.annotation.JsonProperty; + +/** + * @author v_minminghe@webank.com + * @date 2022-09-21 10:19 + * @description + */ +public class DepartmentSubResponse { + + @JsonProperty("department_sub_id") + private String id; + @JsonProperty("department_sub_name") + private String name; + + public String getId() { + return id; + } + + public void setId(String id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/ProductResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/ProductResponse.java new file mode 100644 index 00000000..c4a29e11 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/ProductResponse.java @@ -0,0 +1,26 @@ +package
com.webank.wedatasphere.qualitis.metadata.response; + +/** + * @author allenzhou@webank.com + * @date 2021/3/2 10:54 + */ +public class ProductResponse { + private String productId; + private String productName; + + public String getProductId() { + return productId; + } + + public void setProductId(String productId) { + this.productId = productId; + } + + public String getProductName() { + return productName; + } + + public void setProductName(String productName) { + this.productName = productName; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/SubSystemResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/SubSystemResponse.java new file mode 100644 index 00000000..9b6609da --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/SubSystemResponse.java @@ -0,0 +1,69 @@ +package com.webank.wedatasphere.qualitis.metadata.response; + +import com.fasterxml.jackson.annotation.JsonProperty; + +/** + * @author allenzhou@webank.com + * @date 2021/3/2 10:54 + */ +public class SubSystemResponse { + private Integer subSystemId; + private String subSystemName; + private String subSystemFullCnName; + + @JsonProperty("department_name") + private String departmentName; + + @JsonProperty("dev_department_name") + private String devDepartmentName; + @JsonProperty("ops_department_name") + private String opsDepartmentName; + + public Integer getSubSystemId() { + return subSystemId; + } + + public void setSubSystemId(Integer subSystemId) { + this.subSystemId = subSystemId; + } + + public String getSubSystemName() { + return subSystemName; + } + + public void setSubSystemName(String subSystemName) { + this.subSystemName = subSystemName; + } + + public String getSubSystemFullCnName() { + return subSystemFullCnName; + } + + public void setSubSystemFullCnName(String subSystemFullCnName) { + this.subSystemFullCnName = subSystemFullCnName; + } + + public String 
getDepartmentName() { + return departmentName; + } + + public void setDepartmentName(String departmentName) { + this.departmentName = departmentName; + } + + public String getDevDepartmentName() { + return devDepartmentName; + } + + public void setDevDepartmentName(String devDepartmentName) { + this.devDepartmentName = devDepartmentName; + } + + public String getOpsDepartmentName() { + return opsDepartmentName; + } + + public void setOpsDepartmentName(String opsDepartmentName) { + this.opsDepartmentName = opsDepartmentName; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceInfoDetail.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceInfoDetail.java new file mode 100644 index 00000000..18686be9 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceInfoDetail.java @@ -0,0 +1,181 @@ +package com.webank.wedatasphere.qualitis.metadata.response.datasource; + +import java.util.Date; +import java.util.HashMap; +import java.util.Map; + +/** + * @author v_minminghe@webank.com + * @date 2022-08-30 15:19 + * @description + */ +public class LinkisDataSourceInfoDetail { + + private Long id; + /** Data source name */ + private String dataSourceName; + + /** Data source description */ + private String dataSourceDesc; + + /** ID of data source type */ + private Long dataSourceTypeId; + + /** Identify from creator */ + private String createIdentify; + + /** System name from creator */ + private String createSystem; + /** Connection parameters */ + private Map connectParams = new HashMap<>(); + + /** Create time */ + private Date createTime; + + /** Modify time */ + private Date modifyTime; + + /** Modify user */ + private String modifyUser; + + private String createUser; + + private String labels; + + private Long versionId; + + private Long publishedVersionId; + + private boolean 
expire; + + /** Data source type entity */ + private LinkisDataSourceTypeDetail dataSourceType; + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getDataSourceName() { + return dataSourceName; + } + + public void setDataSourceName(String dataSourceName) { + this.dataSourceName = dataSourceName; + } + + public String getDataSourceDesc() { + return dataSourceDesc; + } + + public void setDataSourceDesc(String dataSourceDesc) { + this.dataSourceDesc = dataSourceDesc; + } + + public Long getDataSourceTypeId() { + return dataSourceTypeId; + } + + public void setDataSourceTypeId(Long dataSourceTypeId) { + this.dataSourceTypeId = dataSourceTypeId; + } + + public String getCreateIdentify() { + return createIdentify; + } + + public void setCreateIdentify(String createIdentify) { + this.createIdentify = createIdentify; + } + + public String getCreateSystem() { + return createSystem; + } + + public void setCreateSystem(String createSystem) { + this.createSystem = createSystem; + } + + public Map getConnectParams() { + return connectParams; + } + + public void setConnectParams(Map connectParams) { + this.connectParams = connectParams; + } + + public Date getCreateTime() { + return createTime; + } + + public void setCreateTime(Date createTime) { + this.createTime = createTime; + } + + public Date getModifyTime() { + return modifyTime; + } + + public void setModifyTime(Date modifyTime) { + this.modifyTime = modifyTime; + } + + public String getModifyUser() { + return modifyUser; + } + + public void setModifyUser(String modifyUser) { + this.modifyUser = modifyUser; + } + + public String getCreateUser() { + return createUser; + } + + public void setCreateUser(String createUser) { + this.createUser = createUser; + } + + public String getLabels() { + return labels; + } + + public void setLabels(String labels) { + this.labels = labels; + } + + public Long getVersionId() { + return versionId; + } + + public void 
setVersionId(Long versionId) { + this.versionId = versionId; + } + + public Long getPublishedVersionId() { + return publishedVersionId; + } + + public void setPublishedVersionId(Long publishedVersionId) { + this.publishedVersionId = publishedVersionId; + } + + public boolean isExpire() { + return expire; + } + + public void setExpire(boolean expire) { + this.expire = expire; + } + + public LinkisDataSourceTypeDetail getDataSourceType() { + return dataSourceType; + } + + public void setDataSourceType(LinkisDataSourceTypeDetail dataSourceType) { + this.dataSourceType = dataSourceType; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceParamsResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceParamsResponse.java new file mode 100644 index 00000000..e7c17225 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceParamsResponse.java @@ -0,0 +1,30 @@ +package com.webank.wedatasphere.qualitis.metadata.response.datasource; + +import com.fasterxml.jackson.annotation.JsonProperty; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-15 9:49 + * @description + */ +public class LinkisDataSourceParamsResponse { + + @JsonProperty(value = "version") + private Long versionId; + + public LinkisDataSourceParamsResponse() { +// Doing something + } + + public LinkisDataSourceParamsResponse(Long versionId) { + this.versionId = versionId; + } + + public Long getVersionId() { + return versionId; + } + + public void setVersionId(Long versionId) { + this.versionId = versionId; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceTypeDetail.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceTypeDetail.java new file mode 100644 index 
00000000..272f157f --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/datasource/LinkisDataSourceTypeDetail.java @@ -0,0 +1,83 @@ +package com.webank.wedatasphere.qualitis.metadata.response.datasource; + +/** + * @author v_minminghe@webank.com + * @date 2022-08-30 15:21 + * @description + */ +public class LinkisDataSourceTypeDetail { + + private String id; + /** Name */ + private String name; + /** Description */ + private String description; + /** The display name of the type */ + private String option; + /** classifier */ + private String classifier; + /** Icon url */ + private String icon; + /** + * Tells the user the number of levels for the datasource eg: for mysql/hive/presto datasource: + * (datasource) --> database --> tables --> column 3 for kafka datasource: (datasource) --> topic + * --> partition 2 + */ + private int layers; + + public String getId() { + return id; + } + + public void setId(String id) { + this.id = id; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } + + public String getOption() { + return option; + } + + public void setOption(String option) { + this.option = option; + } + + public String getClassifier() { + return classifier; + } + + public void setClassifier(String classifier) { + this.classifier = classifier; + } + + public String getIcon() { + return icon; + } + + public void setIcon(String icon) { + this.icon = icon; + } + + public int getLayers() { + return layers; + } + + public void setLayers(int layers) { + this.layers = layers; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/PartitionStatisticsInfo.java 
b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/PartitionStatisticsInfo.java index 0a44219d..e66ea87c 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/PartitionStatisticsInfo.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/PartitionStatisticsInfo.java @@ -29,7 +29,9 @@ public class PartitionStatisticsInfo { @JsonProperty("partition_child_count") private int partitionChildCount; @JsonProperty("partitions") - private List<String> partitions; + private List<Map<String, Object>> partitions; + @JsonProperty("modificationTime") + private Long modificationTime; public PartitionStatisticsInfo() { // Default Constructor @@ -51,11 +53,19 @@ public void setPartitionChildCount(int partitionChildCount) { this.partitionChildCount = partitionChildCount; } - public List<String> getPartitions() { + public List<Map<String, Object>> getPartitions() { return partitions; } - public void setPartitions(List<String> partitions) { + public void setPartitions(List<Map<String, Object>> partitions) { this.partitions = partitions; } + + public Long getModificationTime() { + return modificationTime; + } + + public void setModificationTime(Long modificationTime) { + this.modificationTime = modificationTime; + } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableMetadataInfo.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableMetadataInfo.java new file mode 100644 index 00000000..a150bd7c --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableMetadataInfo.java @@ -0,0 +1,132 @@ +package com.webank.wedatasphere.qualitis.metadata.response.table; + +import java.util.List; +import java.util.Map; + +/** + * @author v_minminghe@webank.com + * @date 2022-06-01 11:36 + * @description + */ +public class TableMetadataInfo { + + private String searchType; + private String source; + private String rawName; + private 
String urn; + private List<SubSystem> subSystemSet; + private List<Map<String, Object>> pathList; + private String busDept; + private String devDept; + + public static TableMetadataInfo build(){ + return new TableMetadataInfo(); + } + + public String getSearchType() { + return searchType; + } + + public void setSearchType(String searchType) { + this.searchType = searchType; + } + + public String getSource() { + return source; + } + + public void setSource(String source) { + this.source = source; + } + + public String getRawName() { + return rawName; + } + + public void setRawName(String rawName) { + this.rawName = rawName; + } + + public String getUrn() { + return urn; + } + + public void setUrn(String urn) { + this.urn = urn; + } + + public List<SubSystem> getSubSystemSet() { + return subSystemSet; + } + + public void setSubSystemSet(List<SubSystem> subSystemSet) { + this.subSystemSet = subSystemSet; + } + + public String getBusDept() { + return busDept; + } + + public void setBusDept(String busDept) { + this.busDept = busDept; + } + + public String getDevDept() { + return devDept; + } + + public void setDevDept(String devDept) { + this.devDept = devDept; + } + + public List<Map<String, Object>> getPathList() { + return pathList; + } + + public void setPathList(List<Map<String, Object>> pathList) { + this.pathList = pathList; + } + + @Override + public String toString() { + return "TableMetadataInfo{" + + "searchType='" + searchType + '\'' + + ", source='" + source + '\'' + + ", rawName='" + rawName + '\'' + + ", urn='" + urn + '\'' + + ", subSystemSet=" + subSystemSet + + ", busDept='" + busDept + '\'' + + ", devDept='" + devDept + '\'' + + '}'; + } + + public static class SubSystem { + + private String fid; + private String name; + + public String getFid() { + return fid; + } + + public void setFid(String fid) { + this.fid = fid; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + @Override + public String toString() { + return "SubSystem{" + + "fid='" + fid + '\'' + + ", name='" + name + 
'\'' + + '}'; + } + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableStatisticsInfo.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableStatisticsInfo.java index 023fb057..57cb0440 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableStatisticsInfo.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableStatisticsInfo.java @@ -29,7 +29,7 @@ public class TableStatisticsInfo { @JsonProperty("table_file_count") private int tableFileCount; @JsonProperty("partitions") - private List<String> partitions; + private List<Map<String, Object>> partitions; public TableStatisticsInfo() { // Default Constructor @@ -51,11 +51,11 @@ public void setTableFileCount(int tableFileCount) { this.tableFileCount = tableFileCount; } - public List<String> getPartitions() { + public List<Map<String, Object>> getPartitions() { return partitions; } - public void setPartitions(List<String> partitions) { + public void setPartitions(List<Map<String, Object>> partitions) { this.partitions = partitions; } } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableTagInfo.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableTagInfo.java new file mode 100644 index 00000000..4c2bed64 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/metadata/response/table/TableTagInfo.java @@ -0,0 +1,98 @@ +package com.webank.wedatasphere.qualitis.metadata.response.table; + +import com.fasterxml.jackson.annotation.JsonProperty; + +/** + * @author v_minminghe@webank.com + * @date 2022-06-01 11:31 + * @description + */ +public class TableTagInfo { + + private Integer id; + @JsonProperty("source_type") + private String sourceType; + @JsonProperty("cluster_type") + private String clusterType; + @JsonProperty("db_code") + private String dbCode; + @JsonProperty("dataset_name") + private String datasetName; 
+ @JsonProperty("tag_code") + private String tagCode; + @JsonProperty("tag_name") + private String tagName; + + public static TableTagInfo build() { + return new TableTagInfo(); + } + + public Integer getId() { + return id; + } + + public void setId(Integer id) { + this.id = id; + } + + public String getSourceType() { + return sourceType; + } + + public void setSourceType(String sourceType) { + this.sourceType = sourceType; + } + + public String getClusterType() { + return clusterType; + } + + public void setClusterType(String clusterType) { + this.clusterType = clusterType; + } + + public String getDbCode() { + return dbCode; + } + + public void setDbCode(String dbCode) { + this.dbCode = dbCode; + } + + public String getDatasetName() { + return datasetName; + } + + public void setDatasetName(String datasetName) { + this.datasetName = datasetName; + } + + public String getTagCode() { + return tagCode; + } + + public void setTagCode(String tagCode) { + this.tagCode = tagCode; + } + + public String getTagName() { + return tagName; + } + + public void setTagName(String tagName) { + this.tagName = tagName; + } + + @Override + public String toString() { + return "TableTagInfo{" + + "id=" + id + + ", sourceType='" + sourceType + '\'' + + ", clusterType='" + clusterType + '\'' + + ", dbCode='" + dbCode + '\'' + + ", datasetName='" + datasetName + '\'' + + ", tagCode='" + tagCode + '\'' + + ", tagName='" + tagName + '\'' + + '}'; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/DmsGeneralResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/DmsGeneralResponse.java new file mode 100644 index 00000000..4da9b2f7 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/DmsGeneralResponse.java @@ -0,0 +1,63 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package com.webank.wedatasphere.qualitis.response; + +import java.io.Serializable; + +/** + * @author howeye + */ +public class DmsGeneralResponse<T> implements Serializable { + private static final long serialVersionUID = 2405172041950251807L; + + private Integer retCode; + private String message; + private T data; + + public DmsGeneralResponse() { + } + + public DmsGeneralResponse(Integer retCode, String message, T data) { + this.retCode = retCode; + this.message = message; + this.data = data; + } + + public Integer getRetCode() { + return retCode; + } + + public void setRetCode(Integer retCode) { + this.retCode = retCode; + } + + public String getMessage() { + return message; + } + + public void setMessage(String message) { + this.message = message; + } + + public T getData() { + return data; + } + + public void setData(T data) { + this.data = data; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/GeneralResponse.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/GeneralResponse.java index 2652989e..a60450f2 100644 --- a/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/GeneralResponse.java +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/response/GeneralResponse.java @@ -22,8 +22,10 @@ * @author howeye */ public class GeneralResponse<T> implements Serializable { - private String code; + private static final long serialVersionUID = 2405172041950251807L; + private String message; + private String code; private T data; public GeneralResponse() { diff 
--git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/CryptoUtils.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/CryptoUtils.java new file mode 100644 index 00000000..0264a2d3 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/CryptoUtils.java @@ -0,0 +1,46 @@ +package com.webank.wedatasphere.qualitis.util; + +import org.apache.commons.codec.binary.Base64; + +import java.io.ByteArrayInputStream; +import java.io.ByteArrayOutputStream; +import java.io.ObjectInputStream; +import java.io.ObjectOutputStream; +import java.io.Serializable; + +/** + * @author v_minminghe@webank.com + * @date 2022-08-11 17:49 + * @description + */ +public class CryptoUtils { + + public static String encode(Serializable o) { + try { + ByteArrayOutputStream bos = new ByteArrayOutputStream(); + ObjectOutputStream oos = new ObjectOutputStream(bos); + oos.writeObject(o); + oos.flush(); + oos.close(); + bos.close(); + return new String(new Base64().encode(bos.toByteArray())); + } catch (Exception e) { + throw new RuntimeException(e.getMessage(), e); + } + } + + public static Object decode(String str) { + try { + ByteArrayInputStream bis = + new ByteArrayInputStream(new Base64().decode(str.getBytes("UTF-8"))); + ObjectInputStream ois = new ObjectInputStream(bis); + Object o = ois.readObject(); + bis.close(); + ois.close(); + return o; + } catch (Exception e) { + throw new RuntimeException(e.getMessage(), e); + } + } + +} diff --git a/web/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java similarity index 72% rename from web/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java rename to core/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java index 437a0507..16bc8aaa 100644 --- a/web/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java +++ 
b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/DateUtils.java @@ -16,6 +16,8 @@ package com.webank.wedatasphere.qualitis.util; +import org.apache.commons.lang.time.FastDateFormat; + import java.math.BigDecimal; import java.util.Date; @@ -25,6 +27,8 @@ */ public class DateUtils { + private static final FastDateFormat PRINT_TIME_FORMAT = FastDateFormat.getInstance("yyyy-MM-dd HH:mm:ss"); + private DateUtils() { // Default Constructor } @@ -34,4 +38,14 @@ public static int getDayDiffBetween(Date startDate, Date endDate) { float days = (float) diff/(1000 * 60 * 60 * 24); return BigDecimal.valueOf(days).setScale(1, BigDecimal.ROUND_HALF_UP).intValue(); } + + public static String now() { + return PRINT_TIME_FORMAT.format(new Date()); + } + + public static String now(String format) { + FastDateFormat printTimeFormat = FastDateFormat.getInstance(format); + return printTimeFormat.format(new Date()); + } + } diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/ExecutorConfig.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/ExecutorConfig.java new file mode 100644 index 00000000..1816a1a8 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/ExecutorConfig.java @@ -0,0 +1,51 @@ +package com.webank.wedatasphere.qualitis.util; + +import groovy.util.logging.Slf4j; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.context.annotation.Bean; +import org.springframework.context.annotation.Configuration; +import org.springframework.scheduling.annotation.EnableAsync; +import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor; + +import java.util.concurrent.Executor; +import java.util.concurrent.ThreadPoolExecutor; + +/** + * @author v_gaojiedeng@webank.com + */ + +@Configuration +@EnableAsync +@Slf4j +public class ExecutorConfig { + private static final Logger log = LoggerFactory.getLogger(ExecutorConfig.class); + + private static final int 
CORE_POOL_SIZE = 50; + private static final int MAX_POOL_SIZE = Integer.MAX_VALUE; + private static final int QUEUE_CAPACITY = 99988; + private static final String NAME_PREFIX = "async-importDB-"; + + @Bean(name = "asyncServiceExecutor") + public Executor asyncServiceExecutor() { + + log.warn("start asyncServiceExecutor"); + // Modify here + ThreadPoolTaskExecutor executor = new VisiableThreadPoolTaskExecutor(); + // Configure the core pool size + executor.setCorePoolSize(CORE_POOL_SIZE); + // Configure the maximum pool size + executor.setMaxPoolSize(MAX_POOL_SIZE); + // Configure the queue capacity + executor.setQueueCapacity(QUEUE_CAPACITY); + // Configure the name prefix of threads in the pool + executor.setThreadNamePrefix(NAME_PREFIX); + // rejection-policy: how to handle new tasks when the pool has reached max size + // CALLER_RUNS: the task is executed in the caller's thread instead of a new thread + executor.setRejectedExecutionHandler(new ThreadPoolExecutor.CallerRunsPolicy()); + // Initialize the executor + executor.initialize(); + return executor; + } + +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/SpringContextHolder.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/SpringContextHolder.java new file mode 100644 index 00000000..0c791eec --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/SpringContextHolder.java @@ -0,0 +1,76 @@ +package com.webank.wedatasphere.qualitis.util; + +import groovy.util.logging.Slf4j; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.BeansException; +import org.springframework.beans.factory.DisposableBean; +import org.springframework.context.ApplicationContext; +import org.springframework.context.ApplicationContextAware; +import org.springframework.context.annotation.Lazy; +import org.springframework.stereotype.Service; + +/** + * @author v_gaojiedeng + */ +@Service +@Lazy(false) +@Slf4j +public class SpringContextHolder implements ApplicationContextAware, DisposableBean { + + private static final Logger LOGGER = LoggerFactory.getLogger(SpringContextHolder.class); + + private static 
ApplicationContext applicationContext = null; + + /** + * Get a bean from the static applicationContext, automatically cast to the type of the target variable. + */ + @SuppressWarnings("unchecked") + public static <T> T getBean(String name) { + assertContextInjected(); + return (T) applicationContext.getBean(name); + } + + /** + * Get a bean from the static applicationContext, automatically cast to the type of the target variable. + */ + public static <T> T getBean(Class<T> requiredType) { + assertContextInjected(); + return applicationContext.getBean(requiredType); + } + + /** + * Check that the ApplicationContext is not null. + */ + private static void assertContextInjected() { + if (applicationContext == null) { + throw new IllegalStateException("The applicationContext property has not been injected. Define SpringContextHolder in applicationContext.xml or register it in the Spring Boot application class."); + } + } + + /** + * Clear the ApplicationContext held by SpringContextHolder (set it to null). + */ + private static void clearHolder() { + LOGGER.debug("Clearing the ApplicationContext in SpringContextHolder: " + applicationContext); + applicationContext = null; + } + + @Override + public void destroy() { + SpringContextHolder.clearHolder(); + } + + @Override + public void setApplicationContext(ApplicationContext applicationContext) throws BeansException { + if (SpringContextHolder.applicationContext != null) { + LOGGER.warn("The ApplicationContext in SpringContextHolder is being overridden; the previous ApplicationContext was: " + SpringContextHolder.applicationContext); + } + SpringContextHolder.applicationContext = applicationContext; + } + + public static String getActiveProfile() { + assertContextInjected(); + return applicationContext.getEnvironment().getActiveProfiles()[0]; + } +} diff --git a/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/VisiableThreadPoolTaskExecutor.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/VisiableThreadPoolTaskExecutor.java new file mode 100644 index 00000000..c8c65d7e --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/VisiableThreadPoolTaskExecutor.java @@ -0,0 +1,71 @@ +package 
com.webank.wedatasphere.qualitis.util; + +import groovy.util.logging.Slf4j; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor; +import org.springframework.util.concurrent.ListenableFuture; + +import java.util.concurrent.Callable; +import java.util.concurrent.Future; +import java.util.concurrent.ThreadPoolExecutor; + +/** + * @author v_gaojiedeng@webank.com + */ +@Slf4j +public class VisiableThreadPoolTaskExecutor extends ThreadPoolTaskExecutor { + + private static final Logger logger = LoggerFactory.getLogger(VisiableThreadPoolTaskExecutor.class); + + private void printCurrentThreadPoolInfo() { + ThreadPoolExecutor threadPoolExecutor = getThreadPoolExecutor(); + if (null == threadPoolExecutor) { + logger.info("The async thread pool has not finished initializing..."); + return; + } + logger.info("Current thread pool status: name prefix-{}, total task count-[{}], completed task count-[{}], active worker thread count-[{}], task queue size-[{}]", + this.getThreadNamePrefix(), + threadPoolExecutor.getTaskCount(), + threadPoolExecutor.getCompletedTaskCount(), + threadPoolExecutor.getActiveCount(), + threadPoolExecutor.getQueue().size() + ); + } + + @Override + public void execute(Runnable task) { + printCurrentThreadPoolInfo(); + super.execute(task); + } + + @Override + public void execute(Runnable task, long startTimeout) { + printCurrentThreadPoolInfo(); + super.execute(task, startTimeout); + } + + @Override + public Future<?> submit(Runnable task) { + printCurrentThreadPoolInfo(); + return super.submit(task); + } + + @Override + public <T> Future<T> submit(Callable<T> task) { + printCurrentThreadPoolInfo(); + return super.submit(task); + } + + @Override + public ListenableFuture<?> submitListenable(Runnable task) { + printCurrentThreadPoolInfo(); + return super.submitListenable(task); + } + + @Override + public <T> ListenableFuture<T> submitListenable(Callable<T> task) { + printCurrentThreadPoolInfo(); + return super.submitListenable(task); + } +} diff --git 
a/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/map/CustomObjectMapper.java b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/map/CustomObjectMapper.java new file mode 100644 index 00000000..43a7ae09 --- /dev/null +++ b/core/common/src/main/java/com/webank/wedatasphere/qualitis/util/map/CustomObjectMapper.java @@ -0,0 +1,546 @@ +package com.webank.wedatasphere.qualitis.util.map; + +import com.fasterxml.jackson.databind.JavaType; +import com.fasterxml.jackson.databind.ObjectMapper; +import com.fasterxml.jackson.databind.type.CollectionType; +import com.fasterxml.jackson.databind.type.MapType; + +import java.text.SimpleDateFormat; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Collection; +import java.util.Collections; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.Map; +import java.util.Map.Entry; + +/** + * @author v_gaojiedeng + */ +public class CustomObjectMapper extends ObjectMapper { + + private static final long serialVersionUID = 7112167235632741571L; + + private static final String LEFT_BRACKET = "["; + private static final String RIGHT_BRACKET = "]"; + private static final String LEFT_SQUARE_BRACKETS = "{"; + private static final String RIGHT_SQUARE_BRACKETS = "}"; + private static final String JOINT_MARK = "\""; + + private static CustomObjectMapper defaultInstance; + + private static final String DATE_PATTERN = "yyyy-MM-dd"; + private static final String DATETIME_PATTERN = "yyyy-MM-dd HH:mm:ss"; + private static final String TIMESTAMP_PATTERN = "yyyy-MM-dd HH:mm:ss.SSS"; + + protected static final ThreadLocal<SimpleDateFormat> DATE_FORMAT_THREAD_LOCAL = new ThreadLocal<SimpleDateFormat>() { + @Override + protected SimpleDateFormat initialValue() { + return new SimpleDateFormat(CustomObjectMapper.DATE_PATTERN); + } + }; + + protected static final ThreadLocal<SimpleDateFormat> DATE_TIME_FORMAT_THREAD_LOCAL = new ThreadLocal<SimpleDateFormat>() { + @Override + protected SimpleDateFormat initialValue() { + return new 
SimpleDateFormat(CustomObjectMapper.DATETIME_PATTERN); + } + }; + + protected static final ThreadLocal<SimpleDateFormat> TIMESTAMP_FORMAT_THREAD_LOCAL = new ThreadLocal<SimpleDateFormat>() { + @Override + protected SimpleDateFormat initialValue() { + return new SimpleDateFormat(CustomObjectMapper.TIMESTAMP_PATTERN); + } + }; + + public static CustomObjectMapper newInstance() { + return new CustomObjectMapper(); + } + + private static CustomObjectMapper getDefaultInstance() { + if (CustomObjectMapper.defaultInstance == null) { + CustomObjectMapper.defaultInstance = new CustomObjectMapper(); + } + return CustomObjectMapper.defaultInstance; + } + + public <T> T doTransJsonToObject(String json, T object) { + if (object == null) { + throw new IllegalArgumentException("'object' must be not null."); + } + + try { + if (Map.class.isAssignableFrom(object.getClass())) { + throw new IllegalArgumentException("The target object does not support map types."); + } else if (Collection.class.isAssignableFrom(object.getClass())) { + throw new IllegalArgumentException("The target object does not support collection types."); + } else { + JavaType javaType = this.getTypeFactory().constructType(object.getClass()); + return this.readValue(json, javaType); + } + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to object, json=" + json + ", object.getClass()=" + object.getClass(), e); + } + } + + public <T> T doTransJsonToObject(String json, Class<T> clazz) { + if (clazz == null) { + throw new IllegalArgumentException("'clazz' must be not null."); + } + + try { + JavaType javaType = this.getTypeFactory().constructType(clazz); + return this.readValue(json, javaType); + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to object, json=" + json + ", clazz=" + clazz, e); + } + } + + public <T> T doTransJsonToObject(String json, JavaType javaType) { + if (javaType == null) { + throw new IllegalArgumentException("'javaType' must be not null."); + } + + try { + return this.readValue(json, javaType); + 
} catch (Exception e) { + throw new RuntimeException("Cannot trans json to object, json=" + json + ", javaType=" + javaType, e); + } + } + + public <T> List<T> doTransJsonToObjects(String json, T object) { + if (object == null) { + throw new IllegalArgumentException("'object' must be not null."); + } + + try { + if (Map.class.isAssignableFrom(object.getClass())) { + throw new IllegalArgumentException("The target object does not support map types."); + } else if (Collection.class.isAssignableFrom(object.getClass())) { + throw new IllegalArgumentException("The target object does not support collection types."); + } else { + JavaType elementJavaType = this.getTypeFactory().constructType(object.getClass()); + CollectionType collectionType = CollectionType.construct(List.class, elementJavaType); + return this.<List<T>>readValue(json, collectionType); + } + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to object list, json=" + json + ", object.getClass()=" + object.getClass(), e); + } + } + + public <T> List<T> doTransJsonToObjects(String json, Class<T> clazz) { + if (clazz == null) { + throw new IllegalArgumentException("'clazz' must be not null."); + } + + try { + JavaType elementJavaType = this.getTypeFactory().constructType(clazz); + CollectionType collectionType = CollectionType.construct(List.class, elementJavaType); + return this.<List<T>>readValue(json, collectionType); + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to object list, json=" + json + ", clazz=" + clazz, e); + } + } + + public <T> List<T> doTransJsonToObjects(String json, JavaType javaType) { + if (javaType == null) { + throw new IllegalArgumentException("'javaType' must be not null."); + } + + try { + CollectionType collectionType = CollectionType.construct(List.class, javaType); + return this.<List<T>>readValue(json, collectionType); + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to object list, json=" + json + ", javaType=" + javaType, e); + } + } + + 
@SuppressWarnings("unchecked") + public <K, E> Map<K, E> doTransJsonToMap(String json, Class<K> keyClazz, Class<E> elementClazz) { + if (keyClazz == null) { + throw new IllegalArgumentException("'keyClazz' must be not null."); + } + if (elementClazz == null) { + throw new IllegalArgumentException("'elementClazz' must be not null."); + } + + if (json != null) { + if (json.trim().startsWith(LEFT_BRACKET) && json.trim().endsWith(RIGHT_BRACKET)) { + Map<K, E> map = new LinkedHashMap<K, E>(); + List<K> list = this.doTransJsonToObjects(json, keyClazz); + for (K t : list) { + if (keyClazz.equals(elementClazz)) { + map.put(t, (E) t); + } else { + String s = this.doTransObjectToJson(t); + if (String.class.isAssignableFrom(elementClazz)) { + if (s.startsWith("\"") && s.endsWith("\"")) { + s = s.substring(1, s.length() - 1); + } + map.put(t, (E) s); + } else { + E e = this.doTransJsonToObject(s, elementClazz); + map.put(t, e); + } + } + } + return map; + } else if (!json.trim().startsWith(LEFT_SQUARE_BRACKETS) && !json.trim().endsWith(RIGHT_SQUARE_BRACKETS)) { + Map<K, E> map = new LinkedHashMap<K, E>(); + String separator = json.contains(";") ? 
";" : ","; + boolean isMap = json.contains("="); + String[] array = json.split(separator); + for (String t : array) { + K key = null; + E element = null; + String k = null; + String e = null; + if (isMap) { + k = t.substring(0, t.indexOf("=")); + e = t.substring(k.length() + 1); + } else { + k = t; + e = t; + } + + if (String.class.isAssignableFrom(keyClazz)) { + key = (K) k; + } else { + key = this.doTransJsonToObject(k, keyClazz); + } + if (String.class.isAssignableFrom(elementClazz)) { + element = (E) e; + } else { + element = this.doTransJsonToObject(e, elementClazz); + } + map.put(key, element); + } + return map; + } + } + try { + JavaType mapKeyJavaType = this.getTypeFactory().constructType(keyClazz); + JavaType mapValueJavaType = this.getTypeFactory().constructType(elementClazz); + MapType mapType = MapType.construct(LinkedHashMap.class, mapKeyJavaType, mapValueJavaType); + return this.<Map<K, E>>readValue(json, mapType); + } catch (Exception e) { + throw new RuntimeException("Cannot trans json to map, json=" + json + ", keyClazz=" + keyClazz + ", elementClazz=" + elementClazz, e); + } + } + + public String doTransObjectToJson(Object object) { + try { + return this.writeValueAsString(object); + } catch (Exception e) { + throw new RuntimeException("Cannot trans object to json, object=" + object, e); + } + } + + public <T> T doTransMapToObject(Map<String, Object> map, T object) { + if (object == null) { + throw new IllegalArgumentException("'object' must be not null."); + } + + try { + String jsonContent = this.writeValueAsString(map); + JavaType javaType = this.getTypeFactory().constructType(object.getClass()); + T entity = this.readValue(jsonContent, javaType); + return entity; + } catch (Exception e) { + throw new RuntimeException("Cannot trans map to object, map=" + map + ", object.getClass()=" + object.getClass(), e); + } + } + + public <T> T doTransMapToObject(Map<String, Object> map, Class<T> clazz) { + if (clazz == null) { + throw new IllegalArgumentException("'clazz' must be not null."); + } + + try 
{ + String jsonContent = this.writeValueAsString(map); + T entity = this.readValue(jsonContent, clazz); + return entity; + } catch (Exception e) { + throw new RuntimeException("Cannot trans map to object, map=" + map + ", clazz=" + clazz, e); + } + } + + public T doTransMapToObject(Map map, JavaType javaType) { + if (javaType == null) { + throw new IllegalArgumentException("'javaType' must be not null."); + } + + try { + String jsonContent = this.writeValueAsString(map); + T entity = this.readValue(jsonContent, javaType); + return entity; + } catch (Exception e) { + throw new RuntimeException("Cannot trans map to object, map=" + map + ", javaType=" + javaType, e); + } + } + + public List doTransMapsToObjects(List> maps, T object) { + if (object == null) { + throw new IllegalArgumentException("'object' must be not null."); + } + + List entities = new ArrayList(); + for (Map map : maps) { + T entity = this.doTransMapToObject(map, object); + entities.add(entity); + } + return entities; + } + + public List doTransMapsToObjects(List> maps, Class clazz) { + if (clazz == null) { + throw new IllegalArgumentException("'clazz' must be not null."); + } + + List objects = new ArrayList(); + for (Map map : maps) { + T object = this.doTransMapToObject(map, clazz); + objects.add(object); + } + return objects; + } + + public List doTransMapsToObjects(List> maps, JavaType javaType) { + if (javaType == null) { + throw new IllegalArgumentException("'javaType' must be not null."); + } + + List objects = new ArrayList(); + for (Map map : maps) { + T object = this.doTransMapToObject(map, javaType); + objects.add(object); + } + return objects; + } + + public Map doTransObjectToMap(Object object) { + try { + String jsonContent = this.writeValueAsString(object); + JavaType mapKeyJavaType = this.getTypeFactory().constructType(String.class); + JavaType mapValueJavaType = this.getTypeFactory().constructType(Object.class); + MapType mapType = MapType.construct(LinkedHashMap.class, 
mapKeyJavaType, mapValueJavaType); + Map map = this.>readValue(jsonContent, mapType); + return map; + } catch (Exception e) { + throw new RuntimeException("Cannot trans object to map, object=" + object, e); + } + } + + public List> doTransObjectsToMaps(List objects) { + List> maps = new ArrayList>(); + for (Object object : objects) { + Map map = this.doTransObjectToMap(object); + maps.add(map); + } + return maps; + } + + @SuppressWarnings("unchecked") + public List doTransObjects(List objects, String[] properties) { + if (objects == null || objects.size() == 0) { + throw new IllegalArgumentException("'objects' must be not empty."); + } + + List> maps = this.doTransObjectsToMaps(objects); + List> newMaps = new ArrayList>(); + for (Map map : maps) { + Map newMap = new LinkedHashMap(); + for (String property : properties) { + newMap.put(property, map.get(property)); + } + newMaps.add(newMap); + } + return (List) this.doTransMapsToObjects(newMaps, objects.iterator().next().getClass()); + } + + @SuppressWarnings("unchecked") + public T doTransObject(T object, String[] properties) { + if (object == null) { + throw new IllegalArgumentException("'objects' must be not null."); + } + if (properties == null || properties.length == 0) { + throw new IllegalArgumentException("'properties' must be not empty."); + } + + Map map = this.doTransObjectToMap(object); + Map newMap = new LinkedHashMap(); + for (String property : properties) { + newMap.put(property, map.get(property)); + } + return (T) this.doTransMapToObject(newMap, object.getClass()); + } + + public static T transJsonToObject(String json, T object) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToObject(json, object); + } + + public static T transJsonToObject(String json, Class clazz) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToObject(json, clazz); + } + + public static T transJsonToObject(String json, JavaType javaType) { + return 
CustomObjectMapper.getDefaultInstance().doTransJsonToObject(json, javaType); + } + + public static Map transJsonToMap(String json, Class keyClazz, Class elementClazz) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToMap(json, keyClazz, elementClazz); + } + + public static List transJsonToObjects(String json, T object) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToObjects(json, object); + } + + public static List transJsonToObjects(String json, Class clazz) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToObjects(json, clazz); + } + + public static List transJsonToObjects(String json, JavaType javaType) { + return CustomObjectMapper.getDefaultInstance().doTransJsonToObjects(json, javaType); + } + + public static String transObjectToJson(Object object) { + return CustomObjectMapper.getDefaultInstance().doTransObjectToJson(object); + } + + public static T transMapToObject(Map map, Class clazz) { + return CustomObjectMapper.getDefaultInstance().doTransMapToObject(map, clazz); + } + + public static T transMapToObject(Map map, JavaType javaType) { + return CustomObjectMapper.getDefaultInstance().doTransMapToObject(map, javaType); + } + + public static List transMapsToObjects(List> maps, T object) { + return CustomObjectMapper.getDefaultInstance().doTransMapsToObjects(maps, object); + } + + public static List transMapsToObjects(List> maps, Class clazz) { + return CustomObjectMapper.getDefaultInstance().doTransMapsToObjects(maps, clazz); + } + + public static List transMapsToObjects(List> maps, JavaType javaType) { + return CustomObjectMapper.getDefaultInstance().doTransMapsToObjects(maps, javaType); + } + + public static Map transObjectToMap(Object object) { + return CustomObjectMapper.getDefaultInstance().doTransObjectToMap(object); + } + + public static List> transObjectsToMaps(List objects) { + return CustomObjectMapper.getDefaultInstance().doTransObjectsToMaps(objects); + } + + public static List transObjects(List 
objects, String[] properties) { + return CustomObjectMapper.getDefaultInstance().doTransObjects(objects, properties); + } + + public static T transObject(T object, String[] properties) { + return CustomObjectMapper.getDefaultInstance().doTransObject(object, properties); + } + + public static String format(String format, Object... values) { + Object[] args = new Object[values.length]; + for (int i = 0; i < values.length; i++) { + if (values[i] == null) { + args[i] = "NULL"; + } else { + Class valClazz = values[i].getClass(); + if (!String.class.isAssignableFrom(valClazz) + && !Boolean.class.isAssignableFrom(valClazz) + && !boolean.class.isAssignableFrom(valClazz) + && !Number.class.isAssignableFrom(valClazz) + && !int.class.isAssignableFrom(valClazz) + && !long.class.isAssignableFrom(valClazz) + && !float.class.isAssignableFrom(valClazz) + && !short.class.isAssignableFrom(valClazz)) { + String valStr = CustomObjectMapper.transObjectToJson(values[i]); + if (valStr.startsWith("\"") && valStr.endsWith("\"")) { + args[i] = valStr.substring(1, valStr.length() - 1); + } + args[i] = valStr; + } else { + args[i] = values[i]; + } + } + } + return String.format(format, args); + } + + @SuppressWarnings("unchecked") + public static T transValue(Object value, Class clazz) { + if (value == null) { + return null; + } + + if (clazz.isAssignableFrom(value.getClass())) { + return (T) value; + } else if (String.class.isAssignableFrom(value.getClass())) { + if (Boolean.class.isAssignableFrom(clazz) || boolean.class.isAssignableFrom(clazz)) { + return CustomObjectMapper.transJsonToObject((String) value, clazz); + } else { + try { + return CustomObjectMapper.transJsonToObject("\"" + value + "\"", clazz); + } catch (Exception e) { + try { + return CustomObjectMapper.transJsonToObject((String) value, clazz); + } catch (Exception ex) { + throw new RuntimeException("Cannot trans value, value=" + value + ", clazz=" + clazz, e); + } + } + } + } else if (String.class.isAssignableFrom(clazz)) { 
+ String json = CustomObjectMapper.transObjectToJson(value); + if (json.startsWith(JOINT_MARK) && json.endsWith(JOINT_MARK)) { + return (T) json.substring(1, json.length() - 1); + } else { + return (T) json; + } + } else { + String json = CustomObjectMapper.transObjectToJson(value); + return CustomObjectMapper.transJsonToObject(json, clazz); + } + } + + @SuppressWarnings({"unchecked", "rawtypes"}) + public static List transValues(Object values, Class clazz) { + if (values == null) { + return Collections.emptyList(); + } + + Collection temps = new ArrayList(); + if (Collection.class.isAssignableFrom(values.getClass())) { + temps.addAll((Collection) values); + } else if (values.getClass().isArray()) { + temps.addAll(Arrays.asList((Object[]) values)); + } else { + temps.add(values); + } + + List newValues = new ArrayList(); + for (Object temp : temps) { + T newValue = CustomObjectMapper.transValue(temp, clazz); + newValues.add(newValue); + } + return newValues; + } + + public static Map transMapValues(Map map, Class valueClazz) { + if (map == null) { + return null; + } + + Map newMap = new LinkedHashMap(); + for (Entry entry : map.entrySet()) { + T newValue = CustomObjectMapper.transValue(entry.getValue(), valueClazz); + newMap.put(entry.getKey(), newValue); + } + return newMap; + } +} \ No newline at end of file diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/EngineTypeEnum.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/EngineTypeEnum.java new file mode 100644 index 00000000..2b26d59e --- /dev/null +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/EngineTypeEnum.java @@ -0,0 +1,30 @@ +package com.webank.wedatasphere.qualitis; + +/** + * @author allenzhou@webank.com + * @date 2022/7/28 11:45 + */ +public enum EngineTypeEnum { + /** + * 1 SHELL ENGINE + * 2 SPARK ENGINE + */ + DEFAULT_ENGINE(1, "shell"), + SPARK_ENGINE(2, "spark"); + + private Integer code; + private String message; + + 
EngineTypeEnum(Integer code, String message) { + this.code = code; + this.message = message; + } + + public Integer getCode() { + return code; + } + + public String getMessage() { + return message; + } +} diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityJob.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityJob.java index f1970408..826d8f2b 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityJob.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityJob.java @@ -27,8 +27,13 @@ public class DataQualityJob { private Long taskId; private String user; private String startupParam; + private Boolean engineReuse; + private String engineType; + private Integer resultNum; public DataQualityJob() { + // Initial + engineReuse = Boolean.TRUE; jobCode = new ArrayList<>(); } @@ -64,6 +69,30 @@ public void setStartupParam(String startupParam) { this.startupParam = startupParam; } + public Boolean getEngineReuse() { + return engineReuse; + } + + public void setEngineReuse(Boolean engineReuse) { + this.engineReuse = engineReuse; + } + + public String getEngineType() { + return engineType; + } + + public void setEngineType(String engineType) { + this.engineType = engineType; + } + + public Integer getResultNum() { + return resultNum; + } + + public void setResultNum(Integer resultNum) { + this.resultNum = resultNum; + } + @Override public String toString() { return "DataQualityJob{" + diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/FpsColumnInfo.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/FpsColumnInfo.java new file mode 100644 index 00000000..5adaac04 --- /dev/null +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/bean/FpsColumnInfo.java @@ -0,0 +1,70 @@ +package com.webank.wedatasphere.qualitis.bean; + +import com.fasterxml.jackson.annotation.JsonProperty; 
+ +/** + * @author allenzhou + */ +public class FpsColumnInfo { + @JsonProperty("name") + private String name; + @JsonProperty("index") + private Integer index; + @JsonProperty("comment") + private String comment; + @JsonProperty("type") + private String type; + @JsonProperty("dateFormat") + private String dateFormat; + + public FpsColumnInfo() { + } + + public FpsColumnInfo(String name, Integer index, String comment, String type, String dateFormat) { + this.name = name; + this.index = index; + this.comment = comment; + this.type = type; + this.dateFormat = dateFormat; + } + + public String getName() { + return name; + } + + public void setName(String name) { + this.name = name; + } + + public Integer getIndex() { + return index; + } + + public void setIndex(Integer index) { + this.index = index; + } + + public String getComment() { + return comment; + } + + public void setComment(String comment) { + this.comment = comment; + } + + public String getType() { + return type; + } + + public void setType(String type) { + this.type = type; + } + + public String getDateFormat() { + return dateFormat; + } + + public void setDateFormat(String dateFormat) { + this.dateFormat = dateFormat; + } +} diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/DpmConfig.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/DpmConfig.java new file mode 100644 index 00000000..0ac48296 --- /dev/null +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/DpmConfig.java @@ -0,0 +1,67 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package com.webank.wedatasphere.qualitis.config; + +import org.springframework.beans.factory.annotation.Value; +import org.springframework.context.annotation.Configuration; + +/** + * @author allenzhou + */ +@Configuration +public class DpmConfig { + @Value("${linkis.api.meta_data.dpm_server}") + private String datasourceServer; + @Value("${linkis.api.meta_data.dpm_port}") + private Integer datasourcePort; + @Value("${linkis.api.meta_data.dpm_systemAppId}") + private String datasourceSystemAppId; + @Value("${linkis.api.meta_data.dpm_systemAppKey}") + private String datasourceSystemAppKey; + + public String getDatasourceServer() { + return datasourceServer; + } + + public void setDatasourceServer(String datasourceServer) { + this.datasourceServer = datasourceServer; + } + + public Integer getDatasourcePort() { + return datasourcePort; + } + + public void setDatasourcePort(Integer datasourcePort) { + this.datasourcePort = datasourcePort; + } + + public String getDatasourceSystemAppId() { + return datasourceSystemAppId; + } + + public void setDatasourceSystemAppId(String datasourceSystemAppId) { + this.datasourceSystemAppId = datasourceSystemAppId; + } + + public String getDatasourceSystemAppKey() { + return datasourceSystemAppKey; + } + + public void setDatasourceSystemAppKey(String datasourceSystemAppKey) { + this.datasourceSystemAppKey = datasourceSystemAppKey; + } +} diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/FpsConfig.java 
b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/FpsConfig.java new file mode 100644 index 00000000..5242d06d --- /dev/null +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/FpsConfig.java @@ -0,0 +1,162 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package com.webank.wedatasphere.qualitis.config; + +import org.springframework.beans.factory.annotation.Value; +import org.springframework.context.annotation.Configuration; + +/** + * @author allenzhou + */ +@Configuration +public class FpsConfig { + @Value("${linkis.api.prefix}") + private String prefix; + @Value("${linkis.api.submitJob}") + private String submitJob; + @Value("${linkis.api.submitJobNew}") + private String submitJobNew; + @Value("${linkis.api.status}") + private String status; + @Value("${linkis.spark.application.name}") + private String appName; + @Value("${linkis.fps.hdfs_prefix}") + private String hdfsPrefix; + @Value("${linkis.fps.file_system}") + private String fileSystem; + + @Value("${linkis.spark.application.reparation}") + private Integer reparation; + + @Value("${linkis.lightweight_query}") + private Boolean lightweightQuery; + + @Value("${linkis.fps.application.engine.version}") + private String engineVersion; + + @Value("${linkis.fps.application.name}") + private String fpsApplicationName; + + @Value("${linkis.fps.request.max_retries}") + private int maxRetries = 10; + 
@Value("${linkis.fps.request.total_wait_duration}") + private int totalWaitDuration = 10 * 1000; + + public int getMaxRetries() { + return maxRetries; + } + + public void setMaxRetries(int maxRetries) { + this.maxRetries = maxRetries; + } + + public int getTotalWaitDuration() { + return totalWaitDuration; + } + + public void setTotalWaitDuration(int totalWaitDuration) { + this.totalWaitDuration = totalWaitDuration; + } + + public String getHdfsPrefix() { + return hdfsPrefix; + } + + public void setHdfsPrefix(String hdfsPrefix) { + this.hdfsPrefix = hdfsPrefix; + } + + public String getFileSystem() { + return fileSystem; + } + + public void setFileSystem(String fileSystem) { + this.fileSystem = fileSystem; + } + + public String getPrefix() { + return prefix; + } + + public void setPrefix(String prefix) { + this.prefix = prefix; + } + + public String getSubmitJob() { + return submitJob; + } + + public void setSubmitJob(String submitJob) { + this.submitJob = submitJob; + } + + public String getSubmitJobNew() { + return submitJobNew; + } + + public void setSubmitJobNew(String submitJobNew) { + this.submitJobNew = submitJobNew; + } + + public String getStatus() { + return status; + } + + public void setStatus(String status) { + this.status = status; + } + + public String getAppName() { + return appName; + } + + public void setAppName(String appName) { + this.appName = appName; + } + + public Integer getReparation() { + return reparation; + } + + public void setReparation(Integer reparation) { + this.reparation = reparation; + } + + public Boolean getLightweightQuery() { + return lightweightQuery; + } + + public void setLightweightQuery(Boolean lightweightQuery) { + this.lightweightQuery = lightweightQuery; + } + + public String getEngineVersion() { + return engineVersion; + } + + public void setEngineVersion(String engineVersion) { + this.engineVersion = engineVersion; + } + + public String getFpsApplicationName() { + return fpsApplicationName; + } + + public void 
setFpsApplicationName(String fpsApplicationName) { + this.fpsApplicationName = fpsApplicationName; + } +} diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/TaskDataSourceConfig.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/TaskDataSourceConfig.java index 0b61d0d6..89691917 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/TaskDataSourceConfig.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/TaskDataSourceConfig.java @@ -8,8 +8,42 @@ */ @Configuration public class TaskDataSourceConfig { + @Value("${task.persistent.private_key}") + private String privateKey; + @Value("${task.persistent.username}") + private String username; @Value("${task.persistent.password}") private String password; + @Value("${task.persistent.address}") + private String mysqlAddress; + @Value("${task.persistent.mysqlsec_open}") + private Boolean mysqlsecOpen; + @Value("${task.persistent.mysqlsec}") + private String mysqlsec; + @Value("${task.persistent.hive_sort_udf_open}") + private Boolean hiveSortUdfOpen; + @Value("${task.persistent.hive_sort_udf}") + private String hiveSortUdf; + @Value("${task.persistent.hive_sort_udf_class_path}") + private String hiveSortUdfClassPath; + @Value("${task.persistent.hive_sort_udf_lib_path}") + private String hiveSortUdfLibPath; + + public String getPrivateKey() { + return privateKey; + } + + public void setPrivateKey(String privateKey) { + this.privateKey = privateKey; + } + + public String getUsername() { + return username; + } + + public void setUsername(String username) { + this.username = username; + } public String getPassword() { return password; @@ -18,4 +52,52 @@ public String getPassword() { public void setPassword(String password) { this.password = password; } + + public Boolean getMysqlsecOpen() { + return mysqlsecOpen; + } + + public void setMysqlsecOpen(Boolean mysqlsecOpen) { + this.mysqlsecOpen = mysqlsecOpen; + } + + 
public String getMysqlsec() { + return mysqlsec; + } + + public void setMysqlsec(String mysqlsec) { + this.mysqlsec = mysqlsec; + } + + public Boolean getHiveSortUdfOpen() { + return hiveSortUdfOpen; + } + + public void setHiveSortUdfOpen(Boolean hiveSortUdfOpen) { + this.hiveSortUdfOpen = hiveSortUdfOpen; + } + + public String getHiveSortUdf() { + return hiveSortUdf; + } + + public void setHiveSortUdf(String hiveSortUdf) { + this.hiveSortUdf = hiveSortUdf; + } + + public String getHiveSortUdfClassPath() { + return hiveSortUdfClassPath; + } + + public void setHiveSortUdfClassPath(String hiveSortUdfClassPath) { + this.hiveSortUdfClassPath = hiveSortUdfClassPath; + } + + public String getHiveSortUdfLibPath() { + return hiveSortUdfLibPath; + } + + public void setHiveSortUdfLibPath(String hiveSortUdfLibPath) { + this.hiveSortUdfLibPath = hiveSortUdfLibPath; + } } diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/constant/OptTypeEnum.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/constant/OptTypeEnum.java new file mode 100644 index 00000000..858f9225 --- /dev/null +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/constant/OptTypeEnum.java @@ -0,0 +1,49 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package com.webank.wedatasphere.qualitis.constant; + +/** + * @author allenzhou + */ +public enum OptTypeEnum { + /** + * Opt type + */ + CHECK_DF(1, "checkDF"), + STATISTIC_DF(2, "statisticDF"), + SCHEMAS(3, "schemas"), + NEW_SCHEMAS(4, "newSchemas"), + ORIGINAL_STATISTIC_DF(5, "originalStatisticDF"), + LEFT_JOIN_STATISTIC_DF(6, "leftOriginalStatisticDF"), + RIGHT_JOIN_STATISTIC_DF(7, "rightOriginalStatisticDF"); + + private Integer code; + private String message; + + OptTypeEnum(Integer code, String message) { + this.code = code; + this.message = message; + } + + public Integer getCode() { + return code; + } + + public String getMessage() { + return message; + } +} diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/AbstractTemplateConverter.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/AbstractTemplateConverter.java index db57baa1..27e428e5 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/AbstractTemplateConverter.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/AbstractTemplateConverter.java @@ -18,19 +18,15 @@ import com.webank.wedatasphere.qualitis.bean.DataQualityJob; import com.webank.wedatasphere.qualitis.bean.DataQualityTask; -import com.webank.wedatasphere.qualitis.exception.ConvertException; -import com.webank.wedatasphere.qualitis.exception.DataQualityTaskException; -import com.webank.wedatasphere.qualitis.exception.RuleVariableNotFoundException; -import com.webank.wedatasphere.qualitis.exception.RuleVariableNotSupportException; -import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; -import java.io.IOException; import java.util.Date; +import java.util.List; import java.util.Map; /** * @author howeye */ public abstract class AbstractTemplateConverter { + /** * Convert Task into code that can be executed. 
     * @param dataQualityTask
     * @param date
     * @param setFlag
     * @param execParams
     * @param runDate
     * @param clusterType
     * @param dataSourceMysqlConnect
+     * @param user
+     * @param leftCols
+     * @param rightCols
+     * @param complexCols
+     * @param createUser
     * @return
-     * @throws ConvertException
-     * @throws DataQualityTaskException
-     * @throws RuleVariableNotSupportException
-     * @throws RuleVariableNotFoundException
-     * @throws IOException
-     * @throws UnExpectedRequestException
+     * @throws Exception
     */
    public abstract DataQualityJob convert(DataQualityTask dataQualityTask, Date date, String setFlag, Map execParams, String runDate,
-        String clusterType, Map dataSourceMysqlConnect) throws ConvertException, DataQualityTaskException, RuleVariableNotSupportException, RuleVariableNotFoundException, IOException, UnExpectedRequestException;
+        String clusterType, Map>> dataSourceMysqlConnect, String user, List leftCols, List rightCols,
+        List complexCols, String createUser) throws Exception;
 }
diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/SqlTemplateConverter.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/SqlTemplateConverter.java
index ec3fa6c7..ff336163 100644
--- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/SqlTemplateConverter.java
+++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/SqlTemplateConverter.java
@@ -16,72 +16,72 @@
 package com.webank.wedatasphere.qualitis.converter;
 
+import com.webank.wedatasphere.qualitis.EngineTypeEnum;
+import com.webank.wedatasphere.qualitis.LocalConfig;
 import com.webank.wedatasphere.qualitis.bean.DataQualityJob;
 import com.webank.wedatasphere.qualitis.bean.DataQualityTask;
+import com.webank.wedatasphere.qualitis.bean.FpsColumnInfo;
 import com.webank.wedatasphere.qualitis.bean.RuleTaskDetail;
-import com.webank.wedatasphere.qualitis.config.OptimizationConfig;
+import
com.webank.wedatasphere.qualitis.config.DpmConfig; +import com.webank.wedatasphere.qualitis.config.FpsConfig; +import com.webank.wedatasphere.qualitis.config.TaskDataSourceConfig; +import com.webank.wedatasphere.qualitis.constant.OptTypeEnum; import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; import com.webank.wedatasphere.qualitis.entity.RuleMetric; -import com.webank.wedatasphere.qualitis.exception.ConvertException; -import com.webank.wedatasphere.qualitis.exception.DataQualityTaskException; -import com.webank.wedatasphere.qualitis.exception.RuleVariableNotFoundException; -import com.webank.wedatasphere.qualitis.exception.RuleVariableNotSupportException; -import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.exception.*; +import com.webank.wedatasphere.qualitis.metadata.client.DataStandardClient; import com.webank.wedatasphere.qualitis.metadata.constant.RuleConstraintEnum; -import com.webank.wedatasphere.qualitis.rule.constant.InputActionStepEnum; -import com.webank.wedatasphere.qualitis.rule.constant.RuleTemplateTypeEnum; -import com.webank.wedatasphere.qualitis.rule.constant.TemplateInputTypeEnum; -import com.webank.wedatasphere.qualitis.rule.entity.AlarmConfig; -import com.webank.wedatasphere.qualitis.rule.entity.Rule; -import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSource; -import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSourceMapping; -import com.webank.wedatasphere.qualitis.rule.entity.RuleVariable; -import com.webank.wedatasphere.qualitis.rule.entity.Template; -import com.webank.wedatasphere.qualitis.rule.entity.TemplateMidTableInputMeta; -import com.webank.wedatasphere.qualitis.rule.entity.TemplateStatisticsInputMeta; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.rule.constant.*; +import 
com.webank.wedatasphere.qualitis.rule.entity.*; import com.webank.wedatasphere.qualitis.translator.AbstractTranslator; +import com.webank.wedatasphere.qualitis.util.CryptoUtils; import com.webank.wedatasphere.qualitis.util.DateExprReplaceUtil; -import java.io.IOException; -import java.text.SimpleDateFormat; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Date; -import java.util.HashMap; -import java.util.HashSet; -import java.util.List; -import java.util.Map; -import java.util.Set; -import java.util.regex.Matcher; -import java.util.regex.Pattern; -import java.util.stream.Collectors; +import com.webank.wedatasphere.qualitis.util.DateUtils; +import com.webank.wedatasphere.qualitis.util.QualitisCollectionUtils; +import com.webank.wedatasphere.qualitis.util.map.CustomObjectMapper; import org.apache.commons.collections.CollectionUtils; import org.apache.commons.lang3.StringUtils; import org.apache.logging.log4j.util.Strings; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.beans.factory.annotation.Value; import org.springframework.stereotype.Component; +import java.io.IOException; +import java.text.SimpleDateFormat; +import java.util.*; +import java.util.regex.Matcher; +import java.util.regex.Pattern; +import java.util.stream.Collectors; + /** * SQL Template Converter, can convert task into sql code * example: * val tmp1 = spark.sql("select * from bdp_test_ods_mask.asd where (fdgdfg) and (new_clum_mask is null)"); * tmp1.write.saveAsTable("qualitishduser05_tmp_safe.mid_application_26_20190117094607_649214"); + * * @author howeye */ @Component public class SqlTemplateConverter extends AbstractTemplateConverter { + @Autowired + private TaskDataSourceConfig taskDataSourceConfig; @Autowired private AbstractTranslator abstractTranslator; @Autowired - private OptimizationConfig optimizationConfig; + private LocalConfig localConfig; + @Autowired + private 
DpmConfig dpmConfig; + @Autowired + private FpsConfig fpsConfig; - /** - * For 2149 template mid input meta special solve. - */ - private static final String EN_LINE_PRIMARY_REPEAT = "Field Replace Null Concat"; - private static final String CN_LINE_PRIMARY_REPEAT = "替换空字段拼接"; - private static final String MESSAGE_LINE_PRIMARY_REPEAT = "{&FIELD_REPLACE_NULL_CONCAT}"; + @Autowired + private DataStandardClient dataStandardClient; + @Value("${linkis.sql.communalTableName:common_table}") + private String commonTableName; private static final Pattern PLACEHOLDER_PATTERN = Pattern.compile(".*\\$\\{(.*)}.*"); private static final Pattern AGGREGATE_FUNC_PATTERN = Pattern.compile("[a-zA-Z]+\\([0-9a-zA-Z_]+\\)"); @@ -99,85 +99,88 @@ public class SqlTemplateConverter extends AbstractTemplateConverter { * Multi table solve. */ private static final Long MUL_SOURCE_ACCURACY_TEMPLATE_ID = 17L; - private static final Long MUL_SOURCE_COMMON_TEMPLATE_ID = 19L; - + private static final Long MUL_SOURCE_FULL_TEMPLATE_ID = 20L; /** * Dpm properties. */ private static final String DPM = "dpm"; - /** - * Cluster type end with this, it is new links version. 
- */ - private static final String LINKIS_ONE_VERSION = "1.0"; - private static final String SPARK_SQL_TEMPLATE = "val " + VARIABLE_NAME_PLACEHOLDER + " = spark.sql(\"" + SPARK_SQL_TEMPLATE_PLACEHOLDER + "\");"; - private static final String SPARK_MYSQL_TEMPLATE = "val " + VARIABLE_NAME_PLACEHOLDER + " = spark.read.format(\"jdbc\").option(\"driver\",\"${JDBC_DRIVER}\").option(\"url\",\"jdbc:mysql://${MYSQL_IP}:${MYSQL_PORT}/\").option(\"user\",\"${MYSQL_USER}\").option(\"password\",\"${MYSQL_PASSWORD}\").option(\"query\",\"${SQL}\").load();"; + private static final String SPARK_SQL_TEMPLATE = "var " + VARIABLE_NAME_PLACEHOLDER + " = spark.sql(\"" + SPARK_SQL_TEMPLATE_PLACEHOLDER + "\")"; + private static final String SPARK_MYSQL_TEMPLATE = "val " + VARIABLE_NAME_PLACEHOLDER + " = spark.read.format(\"jdbc\").option(\"driver\",\"${JDBC_DRIVER}\").option(\"url\",\"jdbc:mysql://${MYSQL_IP}:${MYSQL_PORT}/\").option(\"user\",\"${MYSQL_USER}\").option(\"password\",\"${MYSQL_PASSWORD}\").option(\"query\",\"${SQL}\").load()"; private static final String IF_EXIST = "if (spark.catalog.tableExists(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\")) {"; private static final String ELSE_EXIST = "} else {"; private static final String END_EXIST = "}"; private static final String SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE = VARIABLE_NAME_PLACEHOLDER + ".withColumn(\"qualitis_partition_key\", lit(\"${QUALITIS_PARTITION_KEY}\"))" - + ".write.mode(\"append\").partitionBy(\"qualitis_partition_key\").format(\"hive\").saveAsTable(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\");"; + + ".write.mode(\"append\").partitionBy(\"qualitis_partition_key\").format(\"hive\").saveAsTable(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\")"; private static final String SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION = VARIABLE_NAME_PLACEHOLDER + ".withColumn(\"qualitis_partition_key\", lit(\"${QUALITIS_PARTITION_KEY}\"))" - + ".write.mode(\"overwrite\").insertInto(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + 
"\");"; + + ".write.mode(\"overwrite\").insertInto(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\")"; + + private static final String SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE_WITH_ENV = VARIABLE_NAME_PLACEHOLDER + ".withColumn(\"qualitis_partition_key\", lit(\"${QUALITIS_PARTITION_KEY}\"))" + + ".withColumn(\"qualitis_partition_key_env\", lit(\"${QUALITIS_PARTITION_KEY_ENV}\"))" + + ".write.mode(\"append\").partitionBy(\"qualitis_partition_key\", \"qualitis_partition_key_env\").format(\"hive\").saveAsTable(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\");"; + private static final String SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION_WITH_ENV = VARIABLE_NAME_PLACEHOLDER + ".withColumn(\"qualitis_partition_key\", lit(\"${QUALITIS_PARTITION_KEY}\"))" + + ".withColumn(\"qualitis_partition_key_env\", lit(\"${QUALITIS_PARTITION_KEY_ENV}\"))" + + ".write.mode(\"overwrite\").insertInto(\"" + SAVE_MID_TABLE_NAME_PLACEHOLDER + "\")"; - private static final String FPS_SOURCE = "val {TMP_SOURCE} = \"\"\"{\"path\":\"/apps-data/hadoop/{CLUSTER_TYPE}/{USER}/fps/{FPS_TALBE}{FPS_TYPE}\",\"pathType\":\"hdfs\",\"encoding\":\"utf-8\",\"fieldDelimiter\":\"\",\"hasHeader\":{FILE_HEADER},\"sheet\":\"{SHEET_NAME}\",\"quote\":\"\",\"escapeQuotes\":false}\"\"\""; + private static final String FPS_SOURCE = "val {TMP_SOURCE} = \"\"\"{\"path\":\"/apps-data/{USER}/fps{DATA_NOW}/{FPS_TALBE}{FPS_TYPE}\",\"pathType\":\"hdfs\",\"encoding\":\"utf-8\",\"fieldDelimiter\":\"\",\"hasHeader\":{FILE_HEADER},\"sheet\":\"{SHEET_NAME}\",\"quote\":\"\",\"escapeQuotes\":false}\"\"\""; private static final String FPS_DESTINATION = "val {TMP_DESTINATION} = \"\"\"{\"database\":\"{FPS_DB}\",\"tableName\":\"{FPS_TALBE}\",\"importData\":false,\"isPartition\":false,\"partition\":\"\",\"partitionValue\":\"\",\"isOverwrite\":true,\"columns\":{COLUMN_LIST}}\"\"\""; - private static final String FPS_IMPORT = 
"com.webank.wedatasphere.linkis.engine.imexport.LoadData.loadDataToTable(spark,{TMP_SOURCE},{TMP_DESTINATION})"; + private static final String FPS_IMPORT_PREFIX = "org.apache.linkis.engineplugin.spark.imexport.LoadData.loadDataToTable"; + private static final String FPS_IMPORT = FPS_IMPORT_PREFIX + "(spark,{TMP_SOURCE},{TMP_DESTINATION})"; private static final String FPS_DROP_TABLE = "spark.sql(\"drop table {FPS_DB}.{FPS_TALBE}\")"; private static final String FPS_TO_HIVE_WITH_HEADER = "val {DF} =spark.read.option(\"header\", \"true\").option(\"delimiter\", \"{DELIMITER}\").option(\"inferSchema\", \"true\").csv(\"{HDFS_PREFIX}{FPS_FILE_PATH}\")"; - private static final String FPS_FILE_PATH = "/apps-data/hadoop/{CLUSTER_TYPE}/{USER}/fps/"; + private static final String FPS_FILE_PATH = "/apps-data/{USER}/fps{DATA_NOW}/"; private static final String IMPORT_SCHEMA = "import org.apache.spark.sql.types._"; private static final String CONSTRUCT_SCHEMA = "val {SCHEMA} = new StructType()"; private static final String CONSTRUCT_FIELD = ".add(\"{FIELD_NAME}\", {FIELD_TYPE}, true)"; private static final String FPS_TO_HIVE_WITH_SCHEMA = "val {DF} = spark.read.option(\"delimiter\", \"{DELIMITER}\").schema({SCHEMA}).csv(\"{HDFS_PREFIX}{FPS_FILE_PATH}\")"; private static final String DF_REGISTER = "{DF}.registerTempTable(\"{FILE_NAME}\")"; - /** - * Common static field. 
- */ - private static final String AND = "and"; - - private static final Map FILE_TYPE_SUFFIX = new HashMap(4){{ - put(".txt","text"); - put(".csv", "csv"); - put(".xlsx", "excel"); - put(".xls", "excel"); - }}; - - private static final Map JDBC_DRIVER = new HashMap(4){{ - put("mysql","com.mysql.jdbc.Driver"); - put("tdsql", "com.mysql.jdbc.Driver"); - put("oracle", "oracle.jdbc.driver.OracleDriver"); - put("sqlserver", "com.microsoft.jdbc.sqlserver.SQLServerDriver"); - }}; - - private static final Map DATE_FORMAT = new HashMap(4){{ - put("1","yyyyMMdd"); - put("2", "yyyy-MM-dd"); - put("3", "yyyy.MM.dd"); - put("4", "yyyy/MM/dd"); - }}; - - private static final Map FIELD_TYPE = new HashMap(12){{ - put("tinyint","ByteType"); - put("smallint", "ShortType"); - put("int","IntegerType"); - put("bigint", "LongType"); - put("double", "DoubleType"); - put("float", "FloatType"); - put("decimal", "DecimalType(38,24)"); - put("string", "StringType"); - put("char", "StringType"); - put("varchar", "StringType"); - put("timestamp", "TimestampType"); - put("date", "DateType"); - }}; + private static final Map FILE_TYPE_SUFFIX = new HashMap(4); + + private static final Map JDBC_DRIVER = new HashMap(4); + + private static final Map DATE_FORMAT = new HashMap(4); + + private static final Map FIELD_TYPE = new HashMap(12); + + static { + FILE_TYPE_SUFFIX.put(".txt", "text"); + FILE_TYPE_SUFFIX.put(".csv", "csv"); + FILE_TYPE_SUFFIX.put(".xlsx", "excel"); + FILE_TYPE_SUFFIX.put(".xls", "excel"); + + JDBC_DRIVER.put("mysql", "com.mysql.jdbc.Driver"); + JDBC_DRIVER.put("tdsql", "com.mysql.jdbc.Driver"); + JDBC_DRIVER.put("oracle", "oracle.jdbc.driver.OracleDriver"); + JDBC_DRIVER.put("sqlserver", "com.microsoft.jdbc.sqlserver.SQLServerDriver"); + + DATE_FORMAT.put("1", "yyyyMMdd"); + DATE_FORMAT.put("2", "yyyy-MM-dd"); + DATE_FORMAT.put("3", "yyyy.MM.dd"); + DATE_FORMAT.put("4", "yyyy/MM/dd"); + + FIELD_TYPE.put("tinyint", "ByteType"); + FIELD_TYPE.put("smallint", "ShortType"); + 
FIELD_TYPE.put("int", "IntegerType"); + FIELD_TYPE.put("bigint", "LongType"); + FIELD_TYPE.put("double", "DoubleType"); + FIELD_TYPE.put("float", "FloatType"); + FIELD_TYPE.put("decimal", "DecimalType(38,24)"); + FIELD_TYPE.put("string", "StringType"); + FIELD_TYPE.put("char", "StringType"); + FIELD_TYPE.put("varchar", "StringType"); + FIELD_TYPE.put("timestamp", "TimestampType"); + FIELD_TYPE.put("date", "DateType"); + FIELD_TYPE.put("boolean", "BooleanType"); + + } private static final Logger LOGGER = LoggerFactory.getLogger(SqlTemplateConverter.class); /** * Convert task into scala code + * * @param dataQualityTask * @param date * @param setFlag @@ -185,6 +188,11 @@ public class SqlTemplateConverter extends AbstractTemplateConverter { * @param runDate * @param clusterType * @param dataSourceMysqlConnect + * @param user + * @param leftCols + * @param rightCols + * @param complexCols + * @param createUser * @return * @throws ConvertException * @throws DataQualityTaskException @@ -193,17 +201,15 @@ public class SqlTemplateConverter extends AbstractTemplateConverter { */ @Override public DataQualityJob convert(DataQualityTask dataQualityTask, Date date, String setFlag, Map execParams, String runDate - , String clusterType, Map dataSourceMysqlConnect) - throws ConvertException, DataQualityTaskException, RuleVariableNotSupportException, RuleVariableNotFoundException, IOException, UnExpectedRequestException { + , String clusterType, Map>> dataSourceMysqlConnect, String user, List leftCols, List rightCols, List complexCols, String createUser) throws Exception { + boolean withSpark = CollectionUtils.isNotEmpty(complexCols) && Boolean.FALSE.equals(taskDataSourceConfig.getHiveSortUdfOpen()); LOGGER.info("Start to convert template to actual code, task: " + dataQualityTask); if (null == dataQualityTask || dataQualityTask.getRuleTaskDetails().isEmpty()) { throw new DataQualityTaskException("Task can not be null or empty"); } DataQualityJob job = new DataQualityJob(); - List 
initSentence = abstractTranslator.getInitSentence(); - job.getJobCode().addAll(initSentence); - LOGGER.info("Succeed to get init code. codes: " + initSentence); + if (StringUtils.isNotBlank(setFlag)) { LOGGER.info("Start to solve with set flag. Spark set conf string: {}", setFlag); String[] setStrs = setFlag.split(SpecCharEnum.DIVIDER.getValue()); @@ -212,23 +218,400 @@ public DataQualityJob convert(DataQualityTask dataQualityTask, Date date, String } LOGGER.info("Finish to solve with set flag."); } + String engineType = EngineTypeEnum.DEFAULT_ENGINE.getMessage(); + String startupParam = dataQualityTask.getStartupParam(); + String queueName = ""; + boolean engineReUse = true; + boolean midTableReUse = true; + boolean unionAllForSaveResult = false; + if (StringUtils.isNotBlank(startupParam)) { + String[] startupParams = startupParam.split(SpecCharEnum.DIVIDER.getValue()); + + for (String param : startupParams) { + if (StringUtils.isEmpty(param)) { + continue; + } + String[] paramStrs = param.split("="); + if (paramStrs.length < 2) { + continue; + } + String key = paramStrs[0]; + String value = paramStrs[1]; + if ("engine_reuse".equals(key)) { + if ("true".equals(value)) { + engineReUse = true; + startupParam = startupParam.replace("engine_reuse=true", ""); + } else { + engineReUse = false; + startupParam = startupParam.replace("engine_reuse=false", ""); + } + } + if ("mid_table_reuse".equals(key)) { + if ("true".equals(value)) { + midTableReUse = true; + startupParam = startupParam.replace("mid_table_reuse=true", ""); + } else { + midTableReUse = false; + startupParam = startupParam.replace("mid_table_reuse=false", ""); + } + } + if ("union_all_save".equals(key)) { + if ("true".equals(value)) { + unionAllForSaveResult = true; + startupParam = startupParam.replace("union_all_save=true", ""); + } else { + unionAllForSaveResult = false; + startupParam = startupParam.replace("union_all_save=false", ""); + } + } + + if ("qualitis.linkis.engineType".equals(key) && 
!EngineTypeEnum.DEFAULT_ENGINE.getMessage().equals(value)) { + engineType = value; + } + + if ("wds.linkis.rm.yarnqueue".equals(key)) { + queueName = value; + } + } + } + if (execParams.keySet().contains(QualitisConstants.QUALITIS_ENGINE_REUSE) && Boolean.FALSE.equals(Boolean.parseBoolean(execParams.get(QualitisConstants.QUALITIS_ENGINE_REUSE)))) { + engineReUse = false; + } + if (execParams.keySet().contains(QualitisConstants.QUALITIS_MID_TABLE_REUSE) && Boolean.FALSE.equals(Boolean.parseBoolean(execParams.get(QualitisConstants.QUALITIS_MID_TABLE_REUSE)))) { + midTableReUse = false; + } + if (execParams.keySet().contains(QualitisConstants.QUALITIS_UNION_ALL_SAVE) && Boolean.TRUE.equals(Boolean.parseBoolean(execParams.get(QualitisConstants.QUALITIS_UNION_ALL_SAVE)))) { + unionAllForSaveResult = true; + } + if (execParams.keySet().contains(QualitisConstants.QUALITIS_ENGINE_TYPE) && !EngineTypeEnum.DEFAULT_ENGINE.getMessage().equals(execParams.get(QualitisConstants.QUALITIS_ENGINE_TYPE))) { + engineType = EngineTypeEnum.SPARK_ENGINE.getMessage(); + } + List initSentence = abstractTranslator.getInitSentence(); + job.getJobCode().addAll(initSentence); + + List envNames = new ArrayList<>(); + boolean shareConnect = CollectionUtils.isNotEmpty(dataQualityTask.getConnectShare()); + List communalSentence = getCommunalSentence(dataQualityTask, envNames); + job.getJobCode().addAll(communalSentence); int count = 0; for (RuleTaskDetail ruleTaskDetail : dataQualityTask.getRuleTaskDetails()) { - count++; - List codes = generateSparkSqlByTask(ruleTaskDetail.getRule(), date, dataQualityTask.getApplicationId(), ruleTaskDetail.getMidTableName() - , dataQualityTask.getCreateTime(), new StringBuffer(dataQualityTask.getPartition()), execParams, count, runDate, dataSourceMysqlConnect); + if (Boolean.TRUE.equals(ruleTaskDetail.getRule().getUnionAll())) { + unionAllForSaveResult = true; + } + // Get current rule left cols and right cols. 
+ List currentRuleLeftCols = new ArrayList<>(); + List currentRuleRightCols = new ArrayList<>(); + if (CollectionUtils.isNotEmpty(leftCols) && CollectionUtils.isNotEmpty(rightCols)) { + currentRuleLeftCols = leftCols.stream().filter(col -> col.startsWith(ruleTaskDetail.getRule().getId() + SpecCharEnum.MINUS.getValue())).map(col -> col.replace(ruleTaskDetail.getRule().getId() + SpecCharEnum.MINUS.getValue(), "")).collect( + Collectors.toList()); + currentRuleRightCols = rightCols.stream().filter(col -> col.startsWith(ruleTaskDetail.getRule().getId() + SpecCharEnum.MINUS.getValue())).map(col -> col.replace(ruleTaskDetail.getRule().getId() + SpecCharEnum.MINUS.getValue(), "")).collect( + Collectors.toList()); + } + count += 100; + // Fps register dataframe. + List fpsCodes = generateTempHiveTable(ruleTaskDetail.getRule(), dataQualityTask.getUser()); + + job.getJobCode().addAll(fpsCodes); + // Handle hive engine task depending on startup param(qualitis.linkis.engineType=shell,spark(default)) + if (CollectionUtils.isEmpty(fpsCodes) && !withSpark && taskDataSourceConfig.getMysqlsecOpen() && EngineTypeEnum.DEFAULT_ENGINE.getMessage().equals(engineType) && MUL_SOURCE_FULL_TEMPLATE_ID.equals(ruleTaskDetail.getRule().getTemplate().getId()) && CollectionUtils.isEmpty(dataSourceMysqlConnect.keySet())) { + job.getJobCode().clear(); + // Hql + List codes = generateShellSqlByTask(ruleTaskDetail.getRule(), date, dataQualityTask.getApplicationId(), dataQualityTask.getCreateTime(), new StringBuilder(dataQualityTask.getPartition()), count, runDate, currentRuleLeftCols, currentRuleRightCols, complexCols, queueName, createUser); + job.setEngineType(EngineTypeEnum.DEFAULT_ENGINE.getMessage()); + job.getJobCode().addAll(codes); + continue; + } + + List codes = generateSparkSqlByTask(job, ruleTaskDetail.getRule(), date, dataQualityTask.getApplicationId(), ruleTaskDetail.getMidTableName() + , dataQualityTask.getCreateTime(), new StringBuilder(dataQualityTask.getPartition()), execParams, 
runDate, dataSourceMysqlConnect, user, midTableReUse + , unionAllForSaveResult, currentRuleLeftCols, currentRuleRightCols, complexCols, createUser, shareConnect, dataQualityTask.getDbShare() + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + dataQualityTask.getTableShare()); + job.setEngineType(EngineTypeEnum.SPARK_ENGINE.getMessage()); job.getJobCode().addAll(codes); - LOGGER.info("Succeed to convert rule into code. rule_id: {}, rul_name: {}, codes: {}", ruleTaskDetail.getRule().getId(), ruleTaskDetail.getRule().getName(), codes); + + if (fpsCodes.contains(FPS_IMPORT_PREFIX)) { + job.getJobCode().addAll(dropHiveTable(ruleTaskDetail.getRule())); + } + LOGGER.info("Succeed to convert rule into code. rule id: {}, rule name: {}, codes: {}", ruleTaskDetail.getRule().getId(), ruleTaskDetail.getRule().getName(), codes); + } + + if (CollectionUtils.isNotEmpty(communalSentence)) { + if (CollectionUtils.isNotEmpty(envNames)) { + for (String envName : envNames) { + job.getJobCode().add("spark.catalog.uncacheTable(\"" + commonTableName + SpecCharEnum.BOTTOM_BAR.getValue() + envName + "\")"); + } + } else { + job.getJobCode().add("spark.catalog.uncacheTable(\"" + commonTableName + "\")"); + } } LOGGER.info("Succeed to convert all rule into actual scala code."); job.setTaskId(dataQualityTask.getTaskId()); - job.setStartupParam(dataQualityTask.getStartupParam()); + job.setStartupParam(startupParam); + job.setEngineReuse(engineReUse); return job; } + private List getCommunalSentence(DataQualityTask dataQualityTask, List envNames) throws UnExpectedRequestException { + List sqlList = new ArrayList<>(); + if (StringUtils.isEmpty(dataQualityTask.getDbShare()) || StringUtils.isEmpty(dataQualityTask.getTableShare())) { + return sqlList; + } + List columnList = new ArrayList<>(); + + String filterPart = dataQualityTask.getFilterShare(); + String fromPart = dataQualityTask.getDbShare() + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + dataQualityTask.getTableShare(); + + String selectPart = "*"; + 
if (StringUtils.isNotEmpty(dataQualityTask.getColumnShare())) { + String[] columns = dataQualityTask.getColumnShare().split(SpecCharEnum.VERTICAL_BAR.getValue()); + for (String col : columns) { + String colName = col.split(SpecCharEnum.COLON.getValue())[0]; + if (! columnList.contains(colName)) { + columnList.add(colName); + } + } + } + + if (CollectionUtils.isNotEmpty(columnList)) { + selectPart = String.join(SpecCharEnum.COMMA.getValue(), columnList); + } + if (CollectionUtils.isNotEmpty(dataQualityTask.getConnectShare())) { + List> connParamMaps = decryptMysqlInfo(dataQualityTask.getConnectShare()); + for (Map connParams : connParamMaps) { + String envName = (String) connParams.get("envName"); + if (StringUtils.isEmpty(envName)) { + continue; + } + envNames.add(envName); + String host = (String) connParams.get("host"); + String port = (String) connParams.get("port"); + String pwd = (String) connParams.get("password"); + String user = (String) connParams.get("username"); + String dataType = (String) connParams.get("dataType"); + String sql = "select " + selectPart + " from " + fromPart + " where " + filterPart; + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sql) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); + sqlList.add(str.replace(VARIABLE_NAME_PLACEHOLDER, "communalDf_" + envName)); + sqlList.add("communalDf_" + envName + ".cache()"); + sqlList.add("communalDf_" + envName + ".createOrReplaceTempView(\"" + commonTableName + "_" + envName + "\")"); + sqlList.add("spark.catalog.cacheTable(\"" + commonTableName + "_" + envName + "\")"); + } + } else { + sqlList.add("val communalDf = spark.sql(\"select " + selectPart + " from " + fromPart + " where " + filterPart + "\")"); + sqlList.add("communalDf.cache()"); + sqlList.add("communalDf.createOrReplaceTempView(\"" + commonTableName + "\")"); + 
sqlList.add("spark.catalog.cacheTable(\"" + commonTableName + "\")"); + } + return sqlList; + } + + private List generateShellSqlByTask(Rule rule, Date date, String applicationId, String createTime, StringBuilder partition, int count + , String runDate, List leftCols, List rightCols, List complexCols, String queueName, String createUser) throws Exception { + + List sqlList = new ArrayList<>(); + + // Get input meta from template + List inputMetaRuleVariables = rule.getRuleVariables().stream().filter(ruleVariable -> ruleVariable.getInputActionStep().equals(InputActionStepEnum.TEMPLATE_INPUT_META.getCode())).collect(Collectors.toList()); + String templateMidTableAction = rule.getTemplate().getMidTableAction().replace("\n", " "); + Map ruleMetricMap = collectRuleMetric(rule); + Map dbTableMap = new HashMap<>(4); + Map filters = new HashMap<>(2); + StringBuilder realFilter = new StringBuilder(); + StringBuilder realColumn = new StringBuilder(); + + templateMidTableAction = getMultiDatasourceFiltesAndUpdateMidTableAction(rule, templateMidTableAction, date, filters); + replaceVariable(templateMidTableAction, inputMetaRuleVariables, partition.toString(), realFilter, realColumn, dbTableMap, date, createUser); + + // If partition is not specified, replace with filter in rule configuration. 
+ if (StringUtils.isBlank(partition.toString())) { + fillPartitionWithRuleConfiguration(partition, rule, templateMidTableAction, inputMetaRuleVariables); + } + + StringBuilder createFunc = new StringBuilder(); + if (Boolean.TRUE.equals(taskDataSourceConfig.getHiveSortUdfOpen()) && CollectionUtils.isNotEmpty(complexCols)) { + createFunc.append("CREATE TEMPORARY FUNCTION ").append(taskDataSourceConfig.getHiveSortUdf()).append(" AS '") + .append(taskDataSourceConfig.getHiveSortUdfClassPath()).append("' USING JAR '") + .append(taskDataSourceConfig.getHiveSortUdfLibPath()).append("';"); + } + + StringBuilder leftConcat = new StringBuilder(); + for (String col : leftCols) { + if (Boolean.TRUE.equals(taskDataSourceConfig.getHiveSortUdfOpen()) && CollectionUtils.isNotEmpty(complexCols) && complexCols.contains(col)) { + leftConcat.append("nvl(").append(taskDataSourceConfig.getHiveSortUdf()).append("(").append(col).append("),''),"); + continue; + } + leftConcat.append("nvl(cast(").append(col).append(" as string),''),"); + } + leftConcat.deleteCharAt(leftConcat.length() - 1); + StringBuilder rightConcat = new StringBuilder(); + for (String col : rightCols) { + if (Boolean.TRUE.equals(taskDataSourceConfig.getHiveSortUdfOpen()) && CollectionUtils.isNotEmpty(complexCols) && complexCols.contains(col)) { + rightConcat.append("nvl(").append(taskDataSourceConfig.getHiveSortUdf()).append("(").append(col).append("),''),"); + continue; + } + rightConcat.append("nvl(cast(").append(col).append(" as string),''),"); + } + rightConcat.deleteCharAt(rightConcat.length() - 1); + StringBuilder setQueue = new StringBuilder(); + if (StringUtils.isNotEmpty(queueName)) { + setQueue.append("set mapreduce.job.queuename=").append(queueName).append(";"); + } + String hiveSql = "count_result_" + count + "=`hive -S -e \"" + setQueue.toString() + createFunc.toString() + "select count(1) as diff_count from (select line_md5, count(1) as md5_count from (select md5(concat_ws(''," + leftConcat.toString() + ")) 
as line_md5 from " + dbTableMap.get("left_database") + dbTableMap.get("left_table") + " where " + filters.get("left_table") + ") left_tmp group by left_tmp.line_md5) qualitis_left_tmp ${contrast_type} (select line_md5, count(1) as md5_count from (select md5(concat_ws(''," + rightConcat.toString() + ")) as line_md5 from " + dbTableMap.get("right_database") + dbTableMap.get("right_table") + " where " + filters.get("right_table") + ") right_tmp group by right_tmp.line_md5) qualitis_right_tmp ON (qualitis_left_tmp.line_md5 = qualitis_right_tmp.line_md5 AND qualitis_left_tmp.md5_count = qualitis_right_tmp.md5_count) where (qualitis_left_tmp.line_md5 is null AND qualitis_left_tmp.md5_count is null) OR (qualitis_right_tmp.line_md5 is null AND qualitis_right_tmp.md5_count is null) ${outer_filter};\"`"; + hiveSql = hiveSql.replace("${contrast_type}", ContrastTypeEnum.getJoinType(rule.getContrastType())); + if (StringUtils.isNotEmpty(partition.toString())) { + hiveSql = hiveSql.replace("${outer_filter}", "AND (" + partition.toString() + ")"); + } else { + hiveSql = hiveSql.replace("${outer_filter}", ""); + } + sqlList.add(hiveSql); + String toArr = "arr_" + count + "=(${count_result_" + count + "// /})"; + sqlList.add(toArr); + String getLastIndex = "lastIndex_" + count + "=$((${#arr_" + count + "[@]}-1))"; + sqlList.add(getLastIndex); + String getCountValue = "count_value_" + count + "=`echo ${arr_" + count + "[lastIndex_" + count + "]}`"; + sqlList.add(getCountValue); + String mysqlConn = "MYSQL=\"" + taskDataSourceConfig.getMysqlsec() + "\""; + sqlList.add(mysqlConn); + String ruleVersion = rule.getWorkFlowVersion() == null ? 
"" : rule.getWorkFlowVersion(); + if (StringUtils.isEmpty(runDate)) { + runDate = "-1"; + } + String insertSql = "sql_" + count + "=\"INSERT INTO qualitis_application_task_result (application_id, create_time, result_type, rule_id, value, rule_metric_id, run_date, version) VALUES('" + applicationId + "', '" + createTime + "', 'Long', " + rule.getId() + ", $count_value_" + count + ", -1, " + runDate + ", '" + ruleVersion + "');\""; + if (CollectionUtils.isNotEmpty(ruleMetricMap.values())) { + insertSql = "sql_" + count + "=\"INSERT INTO qualitis_application_task_result (application_id, create_time, result_type, rule_id, value, rule_metric_id, run_date, version) VALUES('" + applicationId + "', '" + createTime + "', 'Long', " + rule.getId() + ", $count_value_" + count + ", " + ruleMetricMap.values().iterator().next() + ", " + runDate + ", '" + ruleVersion + "');\""; + } + sqlList.add(insertSql); + sqlList.add("result=\"$($MYSQL -e\"$sql_" + count + "\")\""); + return sqlList; + } + + private List dropHiveTable(Rule rule) { + List codes = new ArrayList<>(); + LOGGER.info("Drop fps temp table after select."); + for (RuleDataSource ruleDataSource : rule.getRuleDataSources()) { + if (StringUtils.isNotBlank(ruleDataSource.getFileId())) { + codes.add(FPS_DROP_TABLE.replace("{FPS_DB}", ruleDataSource.getDbName()).replace("{FPS_TALBE}", ruleDataSource.getTableName())); + } + } + return codes; + } + + /** + * Fps file to hive persistant table. Construct code by call linkis api of 'HDFS file to hive'. 
+ * + * @param ruleDataSource + * @param user + * @param partOfVariableName + * @param count + * @return + * @throws IOException + */ + private List generateHiveTable(RuleDataSource ruleDataSource, String user, String partOfVariableName, int count) { + List res = new ArrayList<>(4); + if (StringUtils.isNotBlank(ruleDataSource.getFileId())) { + LOGGER.info("Start to generate fps to hive code"); + + String sourcePrefix = "tmpSource_"; + String destinationPrefix = "tmpDestination_"; + + String source = FPS_SOURCE.replace("{TMP_SOURCE}", sourcePrefix + partOfVariableName + count) + .replace("{DATA_NOW}", DateUtils.now("yyyyMMdd")) + .replace("{FPS_TALBE}", ruleDataSource.getTableName()) + .replace("{FPS_TYPE}", ruleDataSource.getFileType()) + .replace("{SHEET_NAME}", ruleDataSource.getFileSheetName()) + .replace("{FILE_DELIMITER}", ruleDataSource.getFileDelimiter()) + .replace("{FILE_HEADER}", ruleDataSource.getFileHeader().toString().toLowerCase()); + + source = source.replace("{USER}", user); + + LOGGER.info("Finish to concat hive source: " + source); + List fpsColumnInfos = new ArrayList<>(); + String columnInfo = ruleDataSource.getFileTableDesc(); + int index = 0; + for (String column : columnInfo.split(SpecCharEnum.COMMA.getValue())) { + String name = column.split(":")[0]; + String type = column.split(":")[1]; + if (type.contains("date")) { + String[] dateType = type.split("_"); + FpsColumnInfo fpsColumnInfo = new FpsColumnInfo(name, index, "", dateType[0], DATE_FORMAT.get(dateType[1])); + fpsColumnInfos.add(fpsColumnInfo); + index++; + continue; + } + FpsColumnInfo fpsColumnInfo = new FpsColumnInfo(name, index, "", type, ""); + fpsColumnInfos.add(fpsColumnInfo); + index++; + } + String destination = FPS_DESTINATION.replace("{TMP_DESTINATION}", destinationPrefix + partOfVariableName + count) + .replace("{FPS_DB}", ruleDataSource.getDbName()) + .replace("{FPS_TALBE}", ruleDataSource.getTableName()) + .replace("{COLUMN_LIST}", 
CustomObjectMapper.transObjectToJson(fpsColumnInfos)); + LOGGER.info("hive destination: " + destination); + res.add(source); + res.add(destination); + res.add(FPS_IMPORT.replace("{TMP_SOURCE}", sourcePrefix + partOfVariableName + count).replace("{TMP_DESTINATION}", destinationPrefix + partOfVariableName + count)); + } + return res; + } + + /** + * Fps file to hive temp table with spark session. + * + * @return + */ + public List generateTempHiveTable(Rule rule, String user) { + List res = new ArrayList<>(4); + String partOfVariableName = rule.getProject().getName() + SpecCharEnum.BOTTOM_BAR.getValue() + rule.getName(); + int count = 0; + for (RuleDataSource ruleDataSource : rule.getRuleDataSources()) { + if (StringUtils.isNotBlank(ruleDataSource.getFileId())) { + if (".xlsx".equals(ruleDataSource.getFileType()) || ".xls".equals(ruleDataSource.getFileType())) { + res.addAll(generateHiveTable(ruleDataSource, user, partOfVariableName, count)); + count ++; + continue; + } + String dfPrefix = "tmpDF_"; + String varDf = dfPrefix + partOfVariableName; + String realPath = FPS_FILE_PATH.replace("{DATA_NOW}", DateUtils.now("yyyyMMdd")) + .concat(ruleDataSource.getTableName()).concat(ruleDataSource.getFileType()); + realPath = realPath.replace("{USER}", user); + if (ruleDataSource.getFileHeader()) { + res.add(FPS_TO_HIVE_WITH_HEADER.replace("{DF}", varDf) + .replace("{DELIMITER}", ruleDataSource.getFileDelimiter()) + .replace("{HDFS_PREFIX}", fpsConfig.getHdfsPrefix()) + .replace("{FPS_FILE_PATH}", realPath) + ); + } else { + res.add(IMPORT_SCHEMA); + String varSchema = "tmpSchema_" + partOfVariableName; + StringBuilder schemaCode = new StringBuilder(CONSTRUCT_SCHEMA.replace("{SCHEMA}", varSchema)); + for (String column : ruleDataSource.getFileTableDesc().split(SpecCharEnum.COMMA.getValue())) { + String name = column.split(":")[0]; + String type = column.split(":")[1]; + schemaCode.append(CONSTRUCT_FIELD.replace("{FIELD_NAME}", name).replace("{FIELD_TYPE}", 
FIELD_TYPE.get(type))); + } + res.add(schemaCode.toString()); + res.add(FPS_TO_HIVE_WITH_SCHEMA.replace("{DF}", varDf) + .replace("{DELIMITER}", ruleDataSource.getFileDelimiter()) + .replace("{SCHEMA}", varSchema) + .replace("{HDFS_PREFIX}", fpsConfig.getHdfsPrefix()) + .replace("{FPS_FILE_PATH}", realPath) + ); + } + res.add(DF_REGISTER.replace("{DF}", varDf).replace("{FILE_NAME}", ruleDataSource.getTableName())); + } + } + return res; + } + /** * Convert task into scala code + * + * @param job * @param rule * @param date * @param applicationId @@ -236,134 +619,417 @@ public DataQualityJob convert(DataQualityTask dataQualityTask, Date date, String * @param createTime * @param partition * @param execParams - * @param count * @param runDate * @param dataSourceMysqlConnect + * @param midTableReUse + * @param unionAllForSaveResult + * @param leftCols + * @param rightCols + * @param complexCols + * @param shareConnect + * @param shareFromPart * @return * @throws ConvertException * @throws RuleVariableNotSupportException * @throws RuleVariableNotFoundException */ - private List generateSparkSqlByTask(Rule rule, Date date, String applicationId, String midTableName, String createTime, StringBuffer partition - , Map execParams, int count, String runDate, Map dataSourceMysqlConnect) - throws ConvertException, RuleVariableNotSupportException, RuleVariableNotFoundException, UnExpectedRequestException { - + private List generateSparkSqlByTask(DataQualityJob job, Rule rule, Date date, String applicationId, String midTableName, String createTime + , StringBuilder partition, Map execParams, String runDate, Map>> dataSourceMysqlConnect, String user + , boolean midTableReUse, boolean unionAllForSaveResult, List leftCols, List rightCols, List complexCols, String createUser, boolean shareConnect, String shareFromPart) throws ConvertException, RuleVariableNotSupportException, RuleVariableNotFoundException, UnExpectedRequestException, MetaDataAcquireFailedException { List sqlList = new 
ArrayList<>(); + Map filters = new HashMap<>(2); + // Collect rule metric and build in save sentence sql. - List ruleMetrics = rule.getAlarmConfigs().stream().map(AlarmConfig::getRuleMetric).distinct().collect(Collectors.toList()); - Map ruleMetricMap = new HashMap<>(ruleMetrics.size()); - if (CollectionUtils.isNotEmpty(ruleMetrics)) { - LOGGER.info("Start to get rule metric for task result save. Rule metrics: {}", Arrays.toString(ruleMetrics.toArray())); - for (RuleMetric ruleMetric : ruleMetrics) { - if (ruleMetric != null) { - ruleMetricMap.put(ruleMetric.getName(), ruleMetric.getId()); - } - } - LOGGER.info("Finish to get rule metric for task result save."); - } + Map ruleMetricMap = collectRuleMetric(rule); + // Get SQL from template after remove '\n' - String templateMidTableAction = rule.getTemplate().getMidTableAction().replace("\n"," "); + String templateMidTableAction = rule.getTemplate().getMidTableAction().replace("\n", " "); + String templateEnName = StringUtils.isNotEmpty(rule.getTemplate().getEnName()) ? 
rule.getTemplate().getEnName() : "defaultCheckDF"; - Map filters = new HashMap<>(2); - if (CUSTOM_RULE.intValue() == rule.getRuleType()) { + if (MUL_SOURCE_RULE.intValue() == rule.getRuleType()) { + templateMidTableAction = getMultiDatasourceFiltesAndUpdateMidTableAction(rule, templateMidTableAction, date, filters); + } else if (CUSTOM_RULE.intValue() == rule.getRuleType()) { templateMidTableAction = customMidTableActionUpdate(rule, templateMidTableAction, date, execParams, partition, ruleMetricMap); - } else if (MUL_SOURCE_RULE.intValue() == rule.getRuleType()) { - templateMidTableAction = multiMidTableActionUpdate(rule, templateMidTableAction, date, filters); + templateEnName = "customCheckDF"; } - // Get input meta from template - List inputMetaRuleVariables = rule.getRuleVariables().stream().filter( - ruleVariable -> ruleVariable.getInputActionStep().equals(InputActionStepEnum.TEMPLATE_INPUT_META.getCode())).collect(Collectors.toList()); - // If partition is not specified, replace with filter in rule configuration. + // Get statistics meta + List statisticsRuleVariables = rule.getRuleVariables().stream().filter(ruleVariable -> ruleVariable.getInputActionStep().equals(InputActionStepEnum.STATISTICS_ARG.getCode())).collect(Collectors.toList()); + // Get select input meta + List inputMetaRuleVariables = rule.getRuleVariables().stream().filter(ruleVariable -> ruleVariable.getInputActionStep().equals(InputActionStepEnum.TEMPLATE_INPUT_META.getCode())).collect(Collectors.toList()); + + // If partition is not specified, replace with filter from rule datasource. 
if (StringUtils.isBlank(partition.toString())) { - templateMidTableAction = fillPartitionWithRuleConfiguration(partition, rule, templateMidTableAction, inputMetaRuleVariables); + fillPartitionWithRuleConfiguration(partition, rule, templateMidTableAction, inputMetaRuleVariables); } + // Get dbs and tables Map dbTableMap = new HashMap<>(4); - // Get mappings - StringBuffer mappings = new StringBuffer(); - StringBuffer realFilter = new StringBuffer(); - // Get SQL From template and replace all replaceholders - String midTableAction = replaceVariable(templateMidTableAction, inputMetaRuleVariables, partition.toString(), realFilter, dbTableMap, mappings, date); - - Set templateStatisticsAction = rule.getTemplate().getStatisticAction(); - Map sourceConnect = new HashMap(8); - Map targetConnect = new HashMap(8); + // Get column and filter + StringBuilder realColumn = new StringBuilder(); + StringBuilder realFilter = new StringBuilder(); + // Get template sql and replace all replaceholders + String midTableAction = replaceVariable(templateMidTableAction, inputMetaRuleVariables, partition.toString(), realFilter, realColumn, dbTableMap, date, createUser); + + // Prepare for multiple rule + List> sourceConnect = new ArrayList<>(); + List> targetConnect = new ArrayList<>(); + prepareDecrptedConnectParamForMultipleRule(sourceConnect, targetConnect, dataSourceMysqlConnect, rule); + + Map selectResult = new LinkedHashMap<>(rule.getRuleDataSources().size()); + String partOfVariableName = templateEnName.replace(" ", "") + SpecCharEnum.EQUAL.getValue() + rule.getName(); + handleRuleSelectSql(rule, midTableName, partition, partOfVariableName, runDate, dataSourceMysqlConnect, sqlList, filters, dbTableMap, midTableAction, sourceConnect, targetConnect, selectResult, midTableReUse, unionAllForSaveResult, leftCols, rightCols, complexCols, shareConnect, shareFromPart); + + Set templateMidTableInputMetas = rule.getTemplate().getTemplateMidTableInputMetas(); + boolean saveNewValue = 
templateMidTableInputMetas.stream().anyMatch(templateMidTableInputMeta -> Boolean.TRUE.equals(templateMidTableInputMeta.getWhetherNewValue())); + boolean numRangeNewValue = saveNewValue && templateMidTableInputMetas.stream().anyMatch(templateMidTableInputMeta -> TemplateInputTypeEnum.INTERMEDIATE_EXPRESSION.getCode().equals(templateMidTableInputMeta.getInputType())); + boolean enumListNewValue = saveNewValue && templateMidTableInputMetas.stream().anyMatch(templateMidTableInputMeta -> TemplateInputTypeEnum.LIST.getCode().equals(templateMidTableInputMeta.getInputType()) || TemplateInputTypeEnum.STANDARD_VALUE_EXPRESSION.getCode().equals(templateMidTableInputMeta.getInputType())); + sqlList.addAll(saveStatisticAndSaveMySqlSentence(rule.getWorkFlowVersion() != null ? rule.getWorkFlowVersion() : "", rule.getId(), ruleMetricMap, rule.getTemplate().getStatisticAction(), applicationId, statisticsRuleVariables, createTime, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], runDate, user, realColumn, enumListNewValue, numRangeNewValue, selectResult, unionAllForSaveResult)); + job.setResultNum(selectResult.size()); + return sqlList; + } + + private void prepareDecrptedConnectParamForMultipleRule(List> sourceConnect, List> targetConnect, Map>> dataSourceMysqlConnect, Rule rule) + throws UnExpectedRequestException { if (dataSourceMysqlConnect != null && dataSourceMysqlConnect.size() > 0) { for (RuleDataSource ruleDataSource : rule.getRuleDataSources()) { - Map connectParams = dataSourceMysqlConnect.get(ruleDataSource.getId()); + // Handle multiple datasource connect params. + List> connectParams = dataSourceMysqlConnect.get(ruleDataSource.getId()); if (connectParams == null) { continue; } if (ruleDataSource.getDatasourceIndex() != null && ruleDataSource.getDatasourceIndex().equals(0)) { // If mysql sec, decrypt password and user name. 
-                    sourceConnect = dataSourceMysqlConnect.get(ruleDataSource.getId());
+                    sourceConnect.addAll(decryptMysqlInfo(connectParams));
                 }
                 if (ruleDataSource.getDatasourceIndex() != null && ruleDataSource.getDatasourceIndex().equals(1)) {
                     // If mysql sec, decrypt password and user name.
-                    targetConnect = dataSourceMysqlConnect.get(ruleDataSource.getId());
+                    targetConnect.addAll(decryptMysqlInfo(connectParams));
                 }
             }
         }
-        sqlList.add("val UUID = java.util.UUID.randomUUID.toString");
-        // Cross-table rule
-        if (RuleTemplateTypeEnum.MULTI_SOURCE_TEMPLATE.getCode().equals(rule.getTemplate().getTemplateType()) && dbTableMap.size() > 0) {
-            // Import sql function.
-            sqlList.addAll(getImportSql());
-            // Generate UUID.
-            // Transform original table.
-            Set<String> columns = new HashSet<>();
-            if (rule.getTemplate().getId().longValue() == MUL_SOURCE_ACCURACY_TEMPLATE_ID.longValue()) {
-                // Get accuracy columns.
-                columns = rule.getRuleDataSourceMappings().stream().map(RuleDataSourceMapping::getLeftColumnNames)
-                    .map(column -> column.replace("tmp1.", "").replace("tmp2.", "")).collect(Collectors.toSet());
+    }
+
+    private Map<String, Long> collectRuleMetric(Rule rule) {
+        List<RuleMetric> ruleMetrics = rule.getAlarmConfigs().stream().map(AlarmConfig::getRuleMetric).distinct().collect(Collectors.toList());
+        Map<String, Long> ruleMetricMap = new HashMap<>(ruleMetrics.size());
+        if (CollectionUtils.isNotEmpty(ruleMetrics)) {
+            LOGGER.info("Start to get rule metric for task result save.
Rule metrics: {}", Arrays.toString(ruleMetrics.toArray())); + for (RuleMetric ruleMetric : ruleMetrics) { + if (ruleMetric != null) { + ruleMetricMap.put(ruleMetric.getName(), ruleMetric.getId()); + } } - if (rule.getTemplate().getId().longValue() == MUL_SOURCE_COMMON_TEMPLATE_ID.longValue()) { - sqlList.addAll(getCommonTransformSql(dbTableMap, mappings, count, partition.toString(), filters, sourceConnect, targetConnect)); + LOGGER.info("Finish to get rule metric for task result save."); + } + return ruleMetricMap; + } + + private void handleRuleSelectSql(Rule rule, String midTableName, StringBuilder partition, String partOfVariableName, String runDate, Map>> dataSourceMysqlConnect + , List sqlList, Map filters, Map dbTableMap, String midTableAction, List> sourceConnect, List> targetConnect + , Map selectResult, boolean midTableReUse, boolean unionAllForSaveResult, List leftCols, List rightCols, List complexCols, boolean shareConnect, String shareFromPart) throws UnExpectedRequestException { + + boolean systemCompareTemplate = rule.getTemplate().getId().longValue() == MUL_SOURCE_ACCURACY_TEMPLATE_ID.longValue() || rule.getTemplate().getId().longValue() == MUL_SOURCE_FULL_TEMPLATE_ID.longValue(); + if (systemCompareTemplate && dbTableMap.size() > 0) { + if (rule.getTemplate().getId().longValue() == MUL_SOURCE_ACCURACY_TEMPLATE_ID.longValue()) { + if (CollectionUtils.isNotEmpty(sourceConnect) && CollectionUtils.isNotEmpty(targetConnect)) { + for (Iterator> sourceIterator = sourceConnect.iterator(), targetIterator = targetConnect.iterator(); sourceIterator.hasNext() && targetIterator.hasNext(); ) { + Map sourceConnectMap = sourceIterator.next(); + Map targetConnectMap = targetIterator.next(); + String sourceEnvName = (String) sourceConnectMap.get("envName"); + String targetEnvName = (String) targetConnectMap.get("envName"); + if (StringUtils.isEmpty(sourceEnvName) || StringUtils.isEmpty(targetEnvName)) { + continue; + } + 
sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + sourceEnvName + targetEnvName, sourceConnectMap, targetConnectMap, selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else if (CollectionUtils.isNotEmpty(sourceConnect) && CollectionUtils.isEmpty(targetConnect)) { + for (Iterator> sourceIterator = sourceConnect.iterator(); sourceIterator.hasNext(); ) { + Map sourceConnectMap = sourceIterator.next(); + String sourceEnvName = (String) sourceConnectMap.get("envName"); + if (StringUtils.isEmpty(sourceEnvName)) { + continue; + } + sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + sourceEnvName, sourceIterator.next(), null, selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else if (CollectionUtils.isNotEmpty(targetConnect) && CollectionUtils.isEmpty(sourceConnect)) { + for (Iterator> targetIterator = targetConnect.iterator(); targetIterator.hasNext(); ) { + Map targetConnectMap = targetIterator.next(); + String targetEnvName = (String) targetConnectMap.get("envName"); + if (StringUtils.isEmpty(targetEnvName)) { + continue; + } + sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + targetEnvName, null, targetIterator.next(), selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + 
"Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else { + sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], null, null, selectResult)); + + if (Boolean.TRUE.equals(rule.getTemplate().getSaveMidTable())) { + sqlList.addAll(getSaveMidTableSentenceSettings()); + sqlList.addAll(getSaveMidTableSentence(midTableName, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], runDate, midTableReUse)); + } + } + } else if (rule.getTemplate().getId().longValue() == MUL_SOURCE_FULL_TEMPLATE_ID.longValue()) { + sqlList.add("val UUID = java.util.UUID.randomUUID.toString"); + // Import sql function. + sqlList.addAll(getImportSql()); + List columns = new ArrayList<>(); + String columnsInfo = rule.getRuleDataSources().stream().filter(ruleDataSource -> QualitisConstants.LEFT_INDEX.equals(ruleDataSource.getDatasourceIndex())).iterator().next().getColName(); + + if (StringUtils.isNotEmpty(columnsInfo)) { + String[] realColumns = columnsInfo.split(SpecCharEnum.VERTICAL_BAR.getValue()); + for (String column : realColumns) { + String[] colInfo = column.split(SpecCharEnum.COLON.getValue()); + String colName = colInfo[0]; + String colType = colInfo[1]; + boolean needSort = Boolean.TRUE.equals(taskDataSourceConfig.getHiveSortUdfOpen()) && + (colType.toLowerCase().startsWith(QualitisConstants.MAP_TYPE) || colType.toLowerCase() + .startsWith(QualitisConstants.ARRAY_TYPE) || colType.toLowerCase().startsWith(QualitisConstants.STRUCT_TYPE)); + if (needSort) { + columns.add(taskDataSourceConfig.getHiveSortUdf() + "(" + colName + ")"); + } else { + columns.add(colName); + } + } + } + if (CollectionUtils.isNotEmpty(sourceConnect) && CollectionUtils.isNotEmpty(targetConnect)) { + for (Iterator> sourceIterator = sourceConnect.iterator(), targetIterator = targetConnect.iterator(); sourceIterator.hasNext() && targetIterator.hasNext(); ) { + Map sourceConnectMap = 
sourceIterator.next(); + Map targetConnectMap = targetIterator.next(); + String sourceEnvName = (String) sourceConnectMap.get("envName"); + String targetEnvName = (String) targetConnectMap.get("envName"); + if (StringUtils.isEmpty(sourceEnvName) || StringUtils.isEmpty(targetEnvName)) { + continue; + } + sqlList.addAll(getSpecialTransformSql(dbTableMap, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + sourceEnvName + targetEnvName, partition.toString(), filters, Strings.join(columns, ',') + , sourceIterator.next(), targetIterator.next(), rule.getContrastType(), leftCols, rightCols, complexCols, selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else if (CollectionUtils.isNotEmpty(sourceConnect) && CollectionUtils.isEmpty(targetConnect)) { + for (Iterator> sourceIterator = sourceConnect.iterator(); sourceIterator.hasNext(); ) { + Map sourceConnectMap = sourceIterator.next(); + String sourceEnvName = (String) sourceConnectMap.get("envName"); + if (StringUtils.isEmpty(sourceEnvName)) { + continue; + } + sqlList.addAll(getSpecialTransformSql(dbTableMap, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + sourceEnvName, partition.toString(), filters, Strings.join(columns, ',') + , sourceIterator.next(), null, rule.getContrastType(), leftCols, rightCols, complexCols, selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else if (CollectionUtils.isEmpty(sourceConnect) && CollectionUtils.isNotEmpty(targetConnect)) { + for (Iterator> targetIterator = targetConnect.iterator(); targetIterator.hasNext(); ) { + Map targetConnectMap = 
targetIterator.next(); + String targetEnvName = (String) targetConnectMap.get("envName"); + if (StringUtils.isEmpty(targetEnvName)) { + continue; + } + sqlList.addAll(getSpecialTransformSql(dbTableMap, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + targetEnvName, partition.toString(), filters, Strings.join(columns, ',') + , null, targetIterator.next(), rule.getContrastType(), leftCols, rightCols, complexCols, selectResult)); + } + + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else { + sqlList.addAll(getSpecialTransformSql(dbTableMap, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], partition.toString(), filters, Strings.join(columns, ',') + , null, null, rule.getContrastType(), leftCols, rightCols, complexCols, selectResult)); + + if (Boolean.TRUE.equals(rule.getTemplate().getSaveMidTable())) { + sqlList.addAll(getSaveMidTableSentenceSettings()); + sqlList.addAll(getSaveMidTableSentence(midTableName, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], runDate, midTableReUse)); + } + } } else { - sqlList.addAll(getSpecialTransformSql(dbTableMap, count, partition.toString(), filters, Strings.join(columns, ','), sourceConnect, targetConnect)); - if (optimizationConfig.getLightweightQuery()) { - count += 3; + if (CollectionUtils.isNotEmpty(sourceConnect) && CollectionUtils.isNotEmpty(targetConnect)) { + for (Iterator> sourceIterator = sourceConnect.iterator(), targetIterator = targetConnect.iterator(); sourceIterator.hasNext() && targetIterator.hasNext(); ) { + Map sourceConnectMap = sourceIterator.next(); + Map targetConnectMap = targetIterator.next(); + String sourceEnvName = (String) sourceConnectMap.get("envName"); + String targetEnvName = (String) targetConnectMap.get("envName"); + if (StringUtils.isEmpty(sourceEnvName) || 
StringUtils.isEmpty(targetEnvName)) { + continue; + } + sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + sourceEnvName + targetEnvName, sourceIterator.next(), targetIterator.next(), selectResult)); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sqlList, unionAllForSaveResult); + } else { + sqlList.addAll(getMultiSourceAccuracyfromSql(midTableAction, dbTableMap, filters, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], null, null, selectResult)); + + if (Boolean.TRUE.equals(rule.getTemplate().getSaveMidTable())) { + sqlList.addAll(getSaveMidTableSentenceSettings()); + sqlList.addAll(getSaveMidTableSentence(midTableName, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], runDate, midTableReUse)); + } } } - if (rule.getTemplate().getSaveMidTable()) { - sqlList.addAll(getSaveMidTableSentenceSettings()); - sqlList.addAll(getSaveMidTableSentence(midTableName, count, runDate)); - } } else { // Generate select statement and save into hive database - RuleDataSource ruleDataSource = rule.getRuleDataSources().stream().filter(dataSource -> dataSource.getDatasourceIndex() == null).iterator().next(); - Map connParams = dataSourceMysqlConnect.get(ruleDataSource.getId()); - if (connParams != null) { - connParams = dataSourceMysqlConnect.get(ruleDataSource.getId()); + List> tableEnvs; + RuleDataSource ruleDataSource; + StringBuilder filterFields = new StringBuilder(); + if (RuleTemplateTypeEnum.SINGLE_SOURCE_TEMPLATE.getCode().equals(rule.getRuleType())) { + tableEnvs = null; + ruleDataSource = rule.getRuleDataSources().stream().filter(dataSource -> dataSource.getDatasourceIndex() == null).iterator().next(); + if (Boolean.TRUE.equals(rule.getTemplate().getFilterFields())) { + String filterColName = 
ruleDataSource.getColName(); + List filterColNameList = new ArrayList<>(); + if (StringUtils.isNotEmpty(filterColName)) { + String[] realColumns = filterColName.split(SpecCharEnum.VERTICAL_BAR.getValue()); + for (String column : realColumns) { + String[] colInfo = column.split(SpecCharEnum.COLON.getValue()); + String colName = colInfo[0]; + filterColNameList.add("\"" + colName + "\""); + } + filterFields.append(".select(").append(Strings.join(filterColNameList, ',')).append(")"); + } + } + } else { + ruleDataSource = rule.getRuleDataSources().stream().filter(dataSource -> dataSource.getDatasourceIndex() != null).iterator().next(); + List parsedRuleDataSource = rule.getRuleDataSources().stream().filter(dataSource -> dataSource.getDatasourceIndex() == null).collect(Collectors.toList()); + + tableEnvs = new ArrayList<>(parsedRuleDataSource.size()); + midTableAction = preSelectEnvsSql(ruleDataSource, parsedRuleDataSource, tableEnvs, dataSourceMysqlConnect, sqlList, midTableAction); } + List> decryptedMysqlInfo; - sqlList.addAll(generateSparkSqlAndSaveSentence(midTableAction, midTableName, rule.getTemplate(), count, connParams, runDate)); - count ++; + if (shareConnect) { + decryptedMysqlInfo = dataSourceMysqlConnect.get(ruleDataSource.getId()); + } else { + decryptedMysqlInfo = decryptMysqlInfo(dataSourceMysqlConnect.get(ruleDataSource.getId())); + } + sqlList.addAll(generateSparkSqlAndSaveSentence(midTableAction, midTableName, rule, partOfVariableName, decryptedMysqlInfo, runDate, selectResult, midTableReUse, unionAllForSaveResult, filterFields.toString(), tableEnvs, shareConnect, shareFromPart)); } + } - // Generate statistics statement, and save into mysql - List statisticsRuleVariables = rule.getRuleVariables().stream().filter( - ruleVariable -> ruleVariable.getInputActionStep().equals(InputActionStepEnum.STATISTICS_ARG.getCode())).collect(Collectors.toList()); + private String preSelectEnvsSql(RuleDataSource ruleDataSource, List parsedRuleDataSource, List> 
tableEnvs, Map>> dataSourceMysqlConnect + , List sqlList, String midTableAction) throws UnExpectedRequestException { + + List ruleDataSourceEnvs = ruleDataSource.getRuleDataSourceEnvs(); + if (CollectionUtils.isNotEmpty(ruleDataSourceEnvs)) { + Set ruleDataSourceEnvsForSelect = ruleDataSourceEnvs.stream().filter(ruleDataSourceEnv -> StringUtils.isNotEmpty(ruleDataSourceEnv.getDbAndTable())).collect(Collectors.toSet()); + + if (CollectionUtils.isNotEmpty(ruleDataSourceEnvsForSelect)) { + + List> decryptedMysqlInfo = decryptMysqlInfo(dataSourceMysqlConnect.get(ruleDataSource.getId())); + + LOGGER.info("Start to replace env mappings with register temp table."); + + for (RuleDataSource currentRuleDataSource : parsedRuleDataSource) { + String dbAliasName = currentRuleDataSource.getDbName(); + String tableName = currentRuleDataSource.getTableName(); + + List currentRuleDataSourceEnvs = ruleDataSourceEnvsForSelect.stream().filter(ruleDataSourceEnv -> + currentRuleDataSource.getDbName().equals(ruleDataSourceEnv.getDbAndTable().split(SpecCharEnum.PERIOD.getValue())[1])) + .collect(Collectors.toList()); + + List varEnvs = new ArrayList<>(currentRuleDataSourceEnvs.size()); + for (RuleDataSourceEnv ruleDataSourceEnv : currentRuleDataSourceEnvs) { + String envName = ruleDataSourceEnv.getEnvName(); + String tmp = "select * from " + ruleDataSourceEnv.getDbAndTable().split(SpecCharEnum.PERIOD.getValue())[0] + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + tableName; + String tmpTableName = ruleDataSourceEnv.getDbAndTable().split(SpecCharEnum.PERIOD.getValue())[0] + tableName + "_" + ruleDataSource.getId() + "_" + ruleDataSourceEnv.getId(); + Map connParams = decryptedMysqlInfo.stream().filter(map -> envName.equals(map.get("envName"))).iterator().next(); + String host = (String) connParams.get("host"); + String port = (String) connParams.get("port"); + String pwd = (String) connParams.get("password"); + String user = (String) connParams.get("username"); + String dataType = (String) 
connParams.get("dataType"); + String selectStr = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, tmp) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd) + .replace(VARIABLE_NAME_PLACEHOLDER, tmpTableName); + + String tmpTable = "qualitis_tmp_table_" + tmpTableName; + String regStr = tmpTableName + ".registerTempTable(\"" + tmpTable + "\")"; + LOGGER.info("Select sql [{}] added to sql list for env.", selectStr); + LOGGER.info("Reg sql [{}] added to sql list for env.", regStr); + sqlList.add(selectStr); + sqlList.add(regStr); + + varEnvs.add(dbAliasName + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + tableName + SpecCharEnum.COLON.getValue() + tmpTable + SpecCharEnum.COLON.getValue() + envName); + } - sqlList.addAll(saveStatisticAndSaveMySqlSentence(rule.getId(), ruleMetricMap, templateStatisticsAction, applicationId, statisticsRuleVariables - , createTime, count, runDate)); + tableEnvs.add(varEnvs); + } - return sqlList; + dataSourceMysqlConnect.remove(ruleDataSource.getId()); + LOGGER.info("Finished to replace env mappings with register temp table. 
Sql: " + midTableAction);
+                sqlList.add("Qualitis System Code Dividing Line");
+            }
+        }
+
+        return midTableAction;
+    }
+
+    private List<Map<String, Object>> decryptMysqlInfo(List<Map<String, Object>> connParamMaps) throws UnExpectedRequestException {
+        if (CollectionUtils.isEmpty(connParamMaps)) {
+            return new ArrayList<>();
+        }
+        List<Map<String, Object>> connParamMapsReal = new ArrayList<>(connParamMaps.size());
+        for (Map<String, Object> currentConnectParams : connParamMaps) {
+            if (DPM.equals(currentConnectParams.get("authType"))) {
+                connParamMapsReal.add(getUserNameAndPassword(currentConnectParams));
+            } else {
+                String password = (String) currentConnectParams.get("password");
+                if (currentConnectParams.get("needDecode") != null && "false".equals((String) currentConnectParams.get("needDecode"))) {
+                    currentConnectParams.put("password", password);
+                } else {
+                    currentConnectParams.put("password", CryptoUtils.decode(password));
+                }
+                connParamMapsReal.add(currentConnectParams);
+            }
+        }
+
+        return connParamMapsReal;
+    }
+
+    private Map<String, Object> getUserNameAndPassword(Map<String, Object> connectParams) throws UnExpectedRequestException {
+//        String appId = (String) connectParams.get("appid");
+//        String objectId = (String) connectParams.get("objectid");
+//        String timestamp = (String) connectParams.get("timestamp");
+//
+//        String dk = (String) connectParams.get("dk");
+//        String datasourceInf = LocalNetwork.getNetCardName();
+//        AccountInfoObtainer obtainer = new AccountInfoObtainer(dpmConfig.getDatasourceServer(), dpmConfig.getDatasourcePort(), datasourceInf);
+//        obtainer.init();
+//        try {
+//            AccountInfoSys accountInfoSys = obtainer.getAccountInfo_system(dk, timestamp, dpmConfig.getDatasourceSystemAppId()
+//                , dpmConfig.getDatasourceSystemAppKey(), appId, objectId);
+//            String userName = accountInfoSys.getName();
+//            String passwordTest = accountInfoSys.getPassword();
+//
+//            connectParams.put("username", userName);
+//            connectParams.put("password", passwordTest);
+//        } catch (AccountInfoObtainException e) {
+//            LOGGER.error(e.getMessage(), e);
+//            throw new
UnExpectedRequestException("{&FAILED_TO_GET_USERNAME_PASSWORD}", 500); +// } + return connectParams; } private String customMidTableActionUpdate(Rule rule, String templateMidTableAction, Date date, Map execParams, - StringBuffer partition, Map ruleMetricMap) throws UnExpectedRequestException { + StringBuilder partition, Map ruleMetricMap) throws UnExpectedRequestException { if (StringUtils.isNotBlank(rule.getCsId())) { templateMidTableAction = templateMidTableAction.replace(RuleConstraintEnum.CUSTOM_DATABASE_PREFIS.getValue().concat(SpecCharEnum.PERIOD.getValue()), ""); } if (StringUtils.isNotBlank(partition.toString())) { templateMidTableAction = templateMidTableAction.replace("${filter}", partition.toString()); - } else if (StringUtils.isNotBlank(rule.getWhereContent())){ + } else if (StringUtils.isNotBlank(rule.getWhereContent())) { templateMidTableAction = templateMidTableAction.replace("${filter}", rule.getWhereContent()); } - for (String key : execParams.keySet()) { - templateMidTableAction = templateMidTableAction.replace("${" + key + "}", execParams.get(key)); + for (Map.Entry entry : execParams.entrySet()) { + String key = entry.getKey(); + String value = entry.getValue(); + templateMidTableAction = templateMidTableAction.replace("${" + key + "}", value); } templateMidTableAction = DateExprReplaceUtil.replaceRunDate(date, templateMidTableAction); @@ -376,38 +1042,45 @@ private String customMidTableActionUpdate(Rule rule, String templateMidTableActi return templateMidTableAction; } - private String multiMidTableActionUpdate(Rule rule, String templateMidTableAction, Date date, Map filters) throws UnExpectedRequestException { + private String getMultiDatasourceFiltesAndUpdateMidTableAction(Rule rule, String templateMidTableAction, Date date, Map filters) throws UnExpectedRequestException { Set ruleDataSources = rule.getRuleDataSources(); - if (rule.getParentRule() != null) { - ruleDataSources = new HashSet<>(); - Set parentRuleDataSources = 
rule.getParentRule().getRuleDataSources(); - for (RuleDataSource ruleDataSource : parentRuleDataSources) { - RuleDataSource tmp = new RuleDataSource(ruleDataSource); - if (tmp.getDatasourceIndex() == 0) { - tmp.setDatasourceIndex(1); - } else { - tmp.setDatasourceIndex(0); - } - ruleDataSources.add(tmp); - } - } for (RuleDataSource ruleDataSource : ruleDataSources) { if (ruleDataSource.getDatasourceIndex().equals(0)) { String leftFilter = ruleDataSource.getFilter(); leftFilter = DateExprReplaceUtil.replaceFilter(date, leftFilter); templateMidTableAction = templateMidTableAction.replace(FILTER_LEFT_PLACEHOLDER, leftFilter); - filters.put("source_table", leftFilter); + filters.put("left_table", leftFilter); } else { String rightFilter = ruleDataSource.getFilter(); rightFilter = DateExprReplaceUtil.replaceFilter(date, rightFilter); templateMidTableAction = templateMidTableAction.replace(FILTER_RIGHT_PLACEHOLDER, rightFilter); - filters.put("target_table", rightFilter); + filters.put("right_table", rightFilter); } } + if (rule.getTemplate().getId().longValue() != MUL_SOURCE_ACCURACY_TEMPLATE_ID.longValue()) { + return templateMidTableAction; + } + List ruleDataSourceMappings = rule.getRuleDataSourceMappings().stream() + .filter(ruleDataSourceMapping -> ruleDataSourceMapping.getMappingType() != null && ruleDataSourceMapping.getMappingType().equals(MappingTypeEnum.MATCHING_FIELDS.getCode())).collect(Collectors.toList()); + + if (CollectionUtils.isNotEmpty(ruleDataSourceMappings)) { + StringBuilder compareColumns = new StringBuilder(); + int indexCol = 1; + for (RuleDataSourceMapping ruleDataSourceMapping : ruleDataSourceMappings) { + compareColumns.append(ruleDataSourceMapping.getLeftStatement()).append(" AS ").append("col" + indexCol).append(", "); + indexCol++; + compareColumns.append(ruleDataSourceMapping.getRightStatement()).append(" AS ").append("col" + indexCol).append(", "); + indexCol++; + } + + int index = templateMidTableAction.indexOf("CASE WHEN"); + + 
templateMidTableAction = new StringBuffer(templateMidTableAction).insert(index, compareColumns.toString()).toString();
+        }
         return templateMidTableAction;
     }

-    private String fillPartitionWithRuleConfiguration(StringBuffer partition, Rule rule, String templateMidTableAction, List<RuleVariable> inputMetaRuleVariables) {
+    private String fillPartitionWithRuleConfiguration(StringBuilder partition, Rule rule, String templateMidTableAction, List<RuleVariable> inputMetaRuleVariables) {
         if (rule.getTemplate().getTemplateType().equals(RuleTemplateTypeEnum.SINGLE_SOURCE_TEMPLATE.getCode())) {
             partition.append(new ArrayList<>(rule.getRuleDataSources()).get(0).getFilter());
         } else if (rule.getTemplate().getTemplateType().equals(RuleTemplateTypeEnum.CUSTOM.getCode())) {
@@ -417,131 +1090,179 @@ private String fillPartitionWithRuleConfiguration(StringBuffer partition, Rule r
             }
         } else if (rule.getTemplate().getTemplateType().equals(RuleTemplateTypeEnum.MULTI_SOURCE_TEMPLATE.getCode())) {
             // Replace placeholder.
-            partition.delete(0, partition.length());
             List<RuleVariable> filterVariable = inputMetaRuleVariables.stream().filter(
-                r -> r.getTemplateMidTableInputMeta().getInputType().equals(TemplateInputTypeEnum.CONDITION.getCode())
+                r -> r.getTemplateMidTableInputMeta().getInputType().equals(TemplateInputTypeEnum.COMPARISON_RESULTS_FOR_FILTER.getCode())
             ).collect(Collectors.toList());
-            if (!filterVariable.isEmpty()) {
-                partition.append(filterVariable.get(0).getValue());
+            if (CollectionUtils.isNotEmpty(filterVariable)) {
+                partition.append(filterVariable.iterator().next().getValue());
             }
         }
         return templateMidTableAction;
     }

-    private List<String> getCommonTransformSql(Map<String, String> dbTableMap, StringBuffer mappings, int count, String filter, Map<String, String> filters
-        , Map<String, Object> sourceConnect, Map<String, Object> targetConnect) {
+    private List<String> getMultiSourceAccuracyfromSql(String midTableAction, Map<String, String> dbTableMap, Map<String, String> filters, String partOfVariableName
+        , Map<String, Object> sourceConnect, Map<String, Object> targetConnect, Map<String, String> selectResult) {
         // Solve partition, value, hash value
         List<String> transformSql = new
ArrayList<>(); StringBuilder sourceSql = new StringBuilder(); StringBuilder targetSql = new StringBuilder(); + StringBuilder envName = new StringBuilder(); + sourceSql.append("select *").append(" from ") - .append(dbTableMap.get("source_db")).append(dbTableMap.get("source_table")) - .append(" where ").append(filters.get("source_table")); + .append(dbTableMap.get("left_database")).append(dbTableMap.get("left_table")) + .append(" where ").append(filters.get("left_table")); targetSql.append("select *").append(" from ") - .append(dbTableMap.get("target_db")).append(dbTableMap.get("target_table")) - .append(" where ").append(filters.get("target_table")); + .append(dbTableMap.get("right_database")).append(dbTableMap.get("right_table")) + .append(" where ").append(filters.get("right_table")); + if (sourceConnect != null && sourceConnect.size() > 0) { String host = (String) sourceConnect.get("host"); String port = (String) sourceConnect.get("port"); String user = (String) sourceConnect.get("username"); String pwd = (String) sourceConnect.get("password"); String dataType = (String) sourceConnect.get("dataType"); - String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDF") - .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) - .replace("${MYSQL_IP}", host) - .replace("${MYSQL_PORT}", port) - .replace("${MYSQL_USER}", user) - .replace("${MYSQL_PASSWORD}", pwd); + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDFLeft_" + partOfVariableName) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); transformSql.add(str); + envName.append("[").append((String) sourceConnect.get("envName")).append("]"); } else { - 
transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDF").replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString())); + transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDFLeft_" + partOfVariableName).replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString())); } if (targetConnect != null && targetConnect.size() > 0) { String host = (String) targetConnect.get("host"); String port = (String) targetConnect.get("port"); String user = (String) targetConnect.get("username"); String pwd = (String) targetConnect.get("password"); - String dataType = (String) sourceConnect.get("dataType"); - String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDF_2") - .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) - .replace("${MYSQL_IP}", host) - .replace("${MYSQL_PORT}", port) - .replace("${MYSQL_USER}", user) - .replace("${MYSQL_PASSWORD}", pwd); + String dataType = (String) targetConnect.get("dataType"); + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDFRight_" + partOfVariableName) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); transformSql.add(str); + envName.append("[").append((String) targetConnect.get("envName")).append("]"); } else { - transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDF_2").replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString())); + transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDFRight_" + partOfVariableName).replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString())); } - - transformSql.add("originalDF.registerTempTable(\"tmp1\")"); - transformSql.add("originalDF_2.registerTempTable(\"tmp2\")"); - 
String commonJoin = "SELECT tmp1.* FROM tmp1 LEFT JOIN tmp2 ON " + mappings.toString() + " WHERE " + filter; - String variableName1 = getVariableName(count); - String joinSql = "val " + variableName1 + " = spark.sql(\"" + commonJoin + "\")"; - + transformSql.add("originalDFLeft_" + partOfVariableName + ".registerTempTable(\"tmp1_" + partOfVariableName + dbTableMap.get("left_table") + "\")"); + transformSql.add("originalDFRight_" + partOfVariableName + ".registerTempTable(\"tmp2_" + partOfVariableName + dbTableMap.get("right_table") + "\")"); + String commonJoin = midTableAction + .replace(dbTableMap.get("left_database") + dbTableMap.get("left_table") + " ", "tmp1_" + partOfVariableName + dbTableMap.get("left_table") + " ") + .replace(dbTableMap.get("right_database") + dbTableMap.get("right_table") + " ", "tmp2_" + partOfVariableName + dbTableMap.get("right_table") + " "); + String variableFormer = getVariableNameByRule(OptTypeEnum.ORIGINAL_STATISTIC_DF.getMessage(), partOfVariableName); + String joinSql = "val " + variableFormer + " = spark.sql(\"" + commonJoin + "\")"; transformSql.add(joinSql); + String variableLatter = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName); + // Select compare_result = 0 + transformSql.add("val " + variableLatter + " = " + variableFormer + ".where(" + variableFormer + "(\"compare_result\") === 0)"); + selectResult.put(variableLatter, envName.toString()); return transformSql; } - private List getSpecialTransformSql(Map dbTableMap, int count, String filter, Map filters - , String columns, Map sourceConnect, Map targetConnect) { - // Solve partition fields. 
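The `commonJoin` construction above swaps the real `database.table` references in the mid-table SQL for the Spark temp tables that were just registered, using plain string replacement keyed on the name plus a trailing space. A minimal sketch of that rewrite, with hypothetical table names (the real code pulls them from `dbTableMap`, where the database entry carries its trailing dot):

```java
public class TempTableRewrite {
    // Replace "db.table " references with registered temp-table names,
    // mirroring the commonJoin construction; all names here are hypothetical.
    static String rewrite(String sql, String leftDb, String leftTable,
                          String rightDb, String rightTable, String part) {
        return sql
                .replace(leftDb + leftTable + " ", "tmp1_" + part + leftTable + " ")
                .replace(rightDb + rightTable + " ", "tmp2_" + part + rightTable + " ");
    }

    public static void main(String[] args) {
        String sql = "SELECT * FROM db_a.orders t1 JOIN db_b.orders t2 ON t1.id = t2.id";
        // Database names include the trailing dot, as in dbTableMap.
        System.out.println(rewrite(sql, "db_a.", "orders", "db_b.", "orders", "r1_"));
    }
}
```

The trailing space in the search key keeps the replacement from clobbering longer table names that share a prefix.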
- List partitionFields = new ArrayList<>(); - if (StringUtils.isNotBlank(filter)) { - filter = filter.toLowerCase().trim(); - if (filter.contains(AND)) { - List subPartition = Arrays.asList(filter.split(AND)); - for (String sub : subPartition) { - String partitionField = sub.trim().substring(0, sub.indexOf("=")); - partitionFields.add(partitionField); - } - } else { - String partitionField = filter.substring(0, filter.indexOf("=")); - partitionFields.add(partitionField); - } - } + private List getSpecialTransformSql(Map dbTableMap, String partOfVariableName, String filter, Map filters, String columns + , Map sourceConnect, Map targetConnect, Integer contrastType, List leftCols, List rightCols, List complexCols, Map selectResult) { // Solve partition, value, hash value List transformSql = new ArrayList<>(); StringBuilder sourceSql = new StringBuilder(); StringBuilder targetSql = new StringBuilder(); - if (CollectionUtils.isNotEmpty(partitionFields)) { - if (StringUtils.isNotBlank(columns)) { - sourceSql.append("select ").append(columns); - targetSql.append("select ").append(columns); - } else { - sourceSql.append("select *"); - targetSql.append("select *"); - } - sourceSql.append(" from ").append(dbTableMap.get("source_db")).append(dbTableMap.get("source_table")).append(" where ").append(filter); - targetSql.append(" from ").append(dbTableMap.get("target_db")).append(dbTableMap.get("target_table")).append(" where ").append(filter); + StringBuilder envName = new StringBuilder(); + handleSourceAndTargetSql(dbTableMap, filters, columns, sourceConnect, targetConnect, transformSql, sourceSql, targetSql, leftCols, rightCols, complexCols, envName, partOfVariableName); + // Full line to MD5 with dataframe api transformation. 
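The rewritten `getSpecialTransformSql` builds its full-outer-join statement with `${contrast_type}` and `${outer_filter}` placeholders left in the SQL and fills them afterwards: the join keyword comes from `ContrastTypeEnum`, and the outer filter collapses to an empty string when no filter is configured. A sketch of that two-step substitution, with illustrative template and filter values:

```java
public class JoinPlaceholderFill {
    // Fill the join-type placeholder unconditionally and the outer-filter
    // placeholder only when a filter is present; values are illustrative.
    static String fill(String template, String joinType, String filter) {
        String sql = template.replace("${contrast_type}", joinType);
        if (filter != null && !filter.isEmpty()) {
            return sql.replace("${outer_filter}", "AND (" + filter + ")");
        }
        return sql.replace("${outer_filter}", "");
    }

    public static void main(String[] args) {
        String template = "SELECT * FROM l ${contrast_type} r ON l.k = r.k WHERE l.v IS NULL ${outer_filter}";
        System.out.println(fill(template, "FULL OUTER JOIN", "ds = '20240101'"));
        System.out.println(fill(template, "LEFT JOIN", ""));
    }
}
```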
+ StringBuilder tmpRegisterTableLeft = new StringBuilder(); + StringBuilder tmpRegisterTableRight = new StringBuilder(); + fuleLineToHashLine(transformSql, partOfVariableName, tmpRegisterTableLeft, tmpRegisterTableRight); + String originalVariableName = getVariableNameByRule(OptTypeEnum.ORIGINAL_STATISTIC_DF.getMessage(), partOfVariableName); + String joinSql = "val " + originalVariableName + " = spark.sql(\"SELECT qulaitis_left_tmp.qualitis_full_line_hash_value as left_full_hash_line, qulaitis_left_tmp.qualitis_mul_db_accuracy_num as left_full_line_num, qulaitis_right_tmp.qualitis_full_line_hash_value as right_full_hash_line, qulaitis_right_tmp.qualitis_mul_db_accuracy_num as right_full_line_num FROM (SELECT qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM " + tmpRegisterTableLeft.toString() + " WHERE true group by qualitis_full_line_hash_value) qulaitis_left_tmp ${contrast_type} (SELECT qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM " + tmpRegisterTableRight.toString() + " WHERE true group by qualitis_full_line_hash_value) qulaitis_right_tmp ON (qulaitis_left_tmp.qualitis_full_line_hash_value = qulaitis_right_tmp.qualitis_full_line_hash_value AND qulaitis_left_tmp.qualitis_mul_db_accuracy_num = qulaitis_right_tmp.qualitis_mul_db_accuracy_num) WHERE (qulaitis_right_tmp.qualitis_full_line_hash_value is null AND qulaitis_right_tmp.qualitis_mul_db_accuracy_num is null) OR (qulaitis_left_tmp.qualitis_full_line_hash_value is null AND qulaitis_left_tmp.qualitis_mul_db_accuracy_num is null) ${outer_filter}\")"; + joinSql = joinSql.replace("${contrast_type}", ContrastTypeEnum.getJoinType(contrastType)); + if (StringUtils.isNotEmpty(filter)) { + joinSql = joinSql.replace("${outer_filter}", "AND (" + filter + ")"); + } else { + joinSql = joinSql.replace("${outer_filter}", ""); + } + transformSql.add(joinSql); + String statisticVariableName = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), 
partOfVariableName); + if (fpsConfig.getLightweightQuery()) { + String leftVariableName = getVariableNameByRule(OptTypeEnum.LEFT_JOIN_STATISTIC_DF.getMessage(), partOfVariableName); + String rightVariableName = getVariableNameByRule(OptTypeEnum.RIGHT_JOIN_STATISTIC_DF.getMessage(), partOfVariableName); + + transformSql.add(originalVariableName + ".registerTempTable(\"md5_table_total_" + partOfVariableName + "\")"); + String joinSqlWithLeft = "val " + leftVariableName + " = spark.sql(\"\"\"SELECT \"left\" as source, " + tmpRegisterTableLeft.toString() + ".qualitis_full_line_value as full_line, md5_table_total_" + partOfVariableName + ".left_full_line_num FROM " + tmpRegisterTableLeft.toString() + " full outer join md5_table_total_" + partOfVariableName + " on " + tmpRegisterTableLeft.toString() + ".qualitis_full_line_hash_value = md5_table_total_" + partOfVariableName + ".left_full_hash_line where " + tmpRegisterTableLeft.toString() + ".qualitis_full_line_hash_value is not null and md5_table_total_" + partOfVariableName + ".left_full_hash_line is not null\"\"\")"; + String joinSqlWithRight = "val " + rightVariableName + " = spark.sql(\"\"\"SELECT \"right\" as source, " + tmpRegisterTableRight.toString() + ".qualitis_full_line_value as full_line, md5_table_total_" + partOfVariableName + ".right_full_line_num FROM " + tmpRegisterTableRight.toString() + " full outer join md5_table_total_" + partOfVariableName + " on " + tmpRegisterTableRight.toString() + ".qualitis_full_line_hash_value = md5_table_total_" + partOfVariableName + ".right_full_hash_line where " + tmpRegisterTableRight.toString() + ".qualitis_full_line_hash_value is not null and md5_table_total_" + partOfVariableName + ".right_full_hash_line is not null\"\"\")"; + + transformSql.add(joinSqlWithLeft); + transformSql.add(joinSqlWithRight); + + transformSql.add("val " + statisticVariableName + " = " + leftVariableName + ".union(" + rightVariableName + ")"); } else { - if (StringUtils.isNotBlank(columns)) { - 
sourceSql.append("select ").append(columns); - targetSql.append("select ").append(columns); + transformSql.add("val " + statisticVariableName + " = " + originalVariableName); + } + if (StringUtils.isNotEmpty(envName)) { + selectResult.put(statisticVariableName, envName.toString()); + } + + return transformSql; + } + + private void handleSourceAndTargetSql(Map dbTableMap, Map filters, String columns, Map sourceConnect, Map targetConnect + , List transformSql, StringBuilder sourceSql, StringBuilder targetSql, List leftCols, List rightCols, List complexCols, StringBuilder envName, String partOfVariableName) { + if (StringUtils.isNotBlank(columns)) { + sourceSql.append("select ").append(columns); + targetSql.append("select ").append(columns); + } else { + if (taskDataSourceConfig.getHiveSortUdfOpen() && CollectionUtils.isNotEmpty(complexCols)) { + List leftColsReal = new ArrayList<>(leftCols.size()); + List rightColsReal = new ArrayList<>(rightCols.size()); + sourceSql.append("select "); + targetSql.append("select "); + for (String col : leftCols) { + if (complexCols.contains(col)) { + leftColsReal.add(taskDataSourceConfig.getHiveSortUdf() + "(" + col + ")"); + } else { + leftColsReal.add(col); + } + } + sourceSql.append(String.join(",", leftColsReal)); + + for (String col : rightCols) { + if (complexCols.contains(col)) { + rightColsReal.add(taskDataSourceConfig.getHiveSortUdf() + "(" + col + ")"); + } else { + rightColsReal.add(col); + } + } + targetSql.append(String.join(",", rightColsReal)); } else { sourceSql.append("select *"); targetSql.append("select *"); } - sourceSql.append(" from ").append(dbTableMap.get("source_db")).append(dbTableMap.get("source_table")).append(" where ").append(filters.get("source_table")); - targetSql.append(" from ").append(dbTableMap.get("target_db")).append(dbTableMap.get("target_table")).append(" where ").append(filters.get("target_table")); } + sourceSql.append(" from 
").append(dbTableMap.get("left_database")).append(dbTableMap.get("left_table")).append(" where ").append(filters.get("left_table")); + targetSql.append(" from ").append(dbTableMap.get("right_database")).append(dbTableMap.get("right_table")).append(" where ").append(filters.get("right_table")); + if (sourceConnect != null && sourceConnect.size() > 0) { String host = (String) sourceConnect.get("host"); String port = (String) sourceConnect.get("port"); String user = (String) sourceConnect.get("username"); String pwd = (String) sourceConnect.get("password"); String dataType = (String) sourceConnect.get("dataType"); - String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDF") - .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) - .replace("${MYSQL_IP}", host) - .replace("${MYSQL_PORT}", port) - .replace("${MYSQL_USER}", user) - .replace("${MYSQL_PASSWORD}", pwd); + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDFLeft_" + partOfVariableName) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); transformSql.add(str); + + envName.append("[").append((String) sourceConnect.get("envName")).append("]"); } else { - transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDF").replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString())); + transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDFLeft_" + partOfVariableName).replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sourceSql.toString())); } if (targetConnect != null && targetConnect.size() > 0) { String host = (String) targetConnect.get("host"); @@ -549,67 +1270,37 @@ private List getSpecialTransformSql(Map dbTableMap, int String user = (String) 
targetConnect.get("username"); String pwd = (String) targetConnect.get("password"); String dataType = (String) targetConnect.get("dataType"); - String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDF_2") - .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) - .replace("${MYSQL_IP}", host) - .replace("${MYSQL_PORT}", port) - .replace("${MYSQL_USER}", user) - .replace("${MYSQL_PASSWORD}", pwd); + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString()).replace(VARIABLE_NAME_PLACEHOLDER, "originalDFRight_" + partOfVariableName) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); transformSql.add(str); + envName.append("[").append((String) targetConnect.get("envName")).append("]"); } else { - transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDF_2").replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString())); - } - // Full line to MD5 with dataframe api transformation. 
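The "full line to MD5" step referenced in the comment above reduces each row to a single hash of its JSON serialization (`md5(to_json(struct($"*")))` in the generated Scala), so two tables can be compared hash-to-hash instead of column-by-column. The hashing itself is ordinary MD5 over the serialized line; a standalone sketch using the JDK digest, with an illustrative row string:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class FullLineHash {
    // MD5 of a serialized row, analogous to md5(to_json(struct($"*")))
    // in the generated Scala; the JSON line below is an illustrative example.
    static String md5Hex(String line) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(line.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 is a mandatory JDK algorithm", e);
        }
    }

    public static void main(String[] args) {
        // Identical serialized rows always hash to the same 32-char value.
        System.out.println(md5Hex("{\"id\":1,\"name\":\"qualitis\"}"));
    }
}
```

Because nulls are first filled with a run-scoped UUID (`na.fill(UUID)`), two rows that differ only in null placement still serialize, and therefore hash, differently.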
- fuleLineToHashLine(transformSql, partitionFields); - - String variableName1 = getVariableName(count); - if (optimizationConfig.getLightweightQuery()) { - count ++; - String variableName2 = getVariableName(count); - count ++; - String variableName3 = getVariableName(count); - count ++; - String variableName4 = getVariableName(count); - - String joinSql = "val " + variableName1 + " = spark.sql(\"SELECT qualitis_tmp1.qualitis_full_line_hash_value as left_full_hash_line, qualitis_tmp1.qualitis_mul_db_accuracy_num as left_full_line_num, qualitis_tmp2.qualitis_full_line_hash_value as right_full_hash_line, qualitis_tmp2.qualitis_mul_db_accuracy_num as right_full_line_num FROM (SELECT qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM md5_table_3 WHERE true group by qualitis_full_line_hash_value) qualitis_tmp1 FULL OUTER JOIN (SELECT qualitis_full_line_hash_value, qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM md5_table_4 WHERE true group by qualitis_full_line_hash_value) qualitis_tmp2 ON (qualitis_tmp1.qualitis_full_line_hash_value = qualitis_tmp2.qualitis_full_line_hash_value AND qualitis_tmp1.qualitis_mul_db_accuracy_num = qualitis_tmp2.qualitis_mul_db_accuracy_num) WHERE ( NOT (qualitis_tmp1.qualitis_full_line_hash_value is null AND qualitis_tmp1.qualitis_mul_db_accuracy_num is null) AND (qualitis_tmp2.qualitis_full_line_hash_value is null AND qualitis_tmp2.qualitis_mul_db_accuracy_num is null)) OR ( NOT (qualitis_tmp2.qualitis_full_line_hash_value is null AND qualitis_tmp2.qualitis_mul_db_accuracy_num is null) AND (qualitis_tmp1.qualitis_full_line_hash_value is null AND qualitis_tmp1.qualitis_mul_db_accuracy_num is null))\")"; - transformSql.add(joinSql); - transformSql.add(variableName1 + ".registerTempTable(\"md5_table_5\")"); - - String joinSqlWithLeft = "val " + variableName2 + " = spark.sql(\"\"\"SELECT \"left\" as source, md5_table_3.qualitis_full_line_value as full_line, md5_table_5.left_full_line_num 
FROM md5_table_3 full outer join md5_table_5 on md5_table_3.qualitis_full_line_hash_value = md5_table_5.left_full_hash_line where md5_table_3.qualitis_full_line_hash_value is not null and md5_table_5.left_full_hash_line is not null\"\"\")"; - String joinSqlWithRight = "val " + variableName3 + " = spark.sql(\"\"\"SELECT \"right\" as source, md5_table_4.qualitis_full_line_value as full_line, md5_table_5.right_full_line_num FROM md5_table_4 full outer join md5_table_5 on md5_table_4.qualitis_full_line_hash_value = md5_table_5.right_full_hash_line where md5_table_4.qualitis_full_line_hash_value is not null and md5_table_5.right_full_hash_line is not null\"\"\")"; - - transformSql.add(joinSqlWithLeft); - transformSql.add(joinSqlWithRight); - transformSql.add("val " + variableName4 + "=" + variableName2 + ".union(" + variableName3 + ")"); - } else { - String joinSql = "val " + variableName1 + " = spark.sql(\"SELECT qualitis_tmp1.qualitis_full_line_value as left_full_line, qualitis_tmp1.qualitis_mul_db_accuracy_num as left_full_line_num, qualitis_tmp2.qualitis_full_line_value as right_full_line, qualitis_tmp2.qualitis_mul_db_accuracy_num as right_full_line_num FROM (SELECT qualitis_full_line_value, qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM md5_table_3 WHERE true group by qualitis_full_line_value, qualitis_full_line_hash_value) qualitis_tmp1 FULL OUTER JOIN (SELECT qualitis_full_line_value, qualitis_full_line_hash_value, count(1) as qualitis_mul_db_accuracy_num FROM md5_table_4 WHERE true group by qualitis_full_line_value, qualitis_full_line_hash_value) qualitis_tmp2 ON (qualitis_tmp1.qualitis_full_line_hash_value = qualitis_tmp2.qualitis_full_line_hash_value AND qualitis_tmp1.qualitis_mul_db_accuracy_num = qualitis_tmp2.qualitis_mul_db_accuracy_num) WHERE ( NOT (qualitis_tmp1.qualitis_full_line_hash_value is null AND qualitis_tmp1.qualitis_mul_db_accuracy_num is null) AND (qualitis_tmp2.qualitis_full_line_hash_value is null AND 
qualitis_tmp2.qualitis_mul_db_accuracy_num is null)) OR ( NOT (qualitis_tmp2.qualitis_full_line_hash_value is null AND qualitis_tmp2.qualitis_mul_db_accuracy_num is null) AND (qualitis_tmp1.qualitis_full_line_hash_value is null AND qualitis_tmp1.qualitis_mul_db_accuracy_num is null))\")"; - - transformSql.add(joinSql); + transformSql.add(SPARK_SQL_TEMPLATE.replace(VARIABLE_NAME_PLACEHOLDER, "originalDFRight_" + partOfVariableName).replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, targetSql.toString())); } - - return transformSql; } - private void fuleLineToHashLine(List transformSql, List partitionFields) { - transformSql.add("val fillNullDF = originalDF.na.fill(UUID)"); - transformSql.add("val qualitis_names = fillNullDF.schema.fieldNames"); - transformSql.add("val fileNullWithFullLineWithHashDF = fillNullDF.withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"qualitis_full_line_hash_value\", md5(to_json(struct($\"*\"))))"); - transformSql.add("val qualitis_names_buffer = qualitis_names.toBuffer"); + private void fuleLineToHashLine(List transformSql, String partOfVariableName, StringBuilder tmpRegisterTableLeft, StringBuilder tmpRegisterTableRight) { + transformSql.add("val fillNullDFLeft_" + partOfVariableName + " = originalDFLeft_" + partOfVariableName + ".na.fill(UUID)"); + transformSql.add("val qualitis_names_left_" + partOfVariableName + " = fillNullDFLeft_" + partOfVariableName + ".schema.fieldNames"); + transformSql.add("val fillNullWithFullLineWithHashDF_left_" + partOfVariableName + " = fillNullDFLeft_" + partOfVariableName + ".withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"qualitis_full_line_hash_value\", md5(to_json(struct($\"*\"))))"); + transformSql.add("val qualitis_names_left_" + partOfVariableName + "_buffer = qualitis_names_left_" + partOfVariableName + ".toBuffer"); - transformSql.add("val fillNullDF_2 = originalDF_2.na.fill(UUID)"); - transformSql.add("val qualitis_names_2 = 
fillNullDF_2.schema.fieldNames"); - transformSql.add("val fileNullWithFullLineWithHashDF_2 = fillNullDF_2.withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"qualitis_full_line_hash_value\", md5(to_json(struct($\"*\"))))"); - transformSql.add("val qualitis_names_buffer_2 = qualitis_names_2.toBuffer"); + transformSql.add("val fillNullDFRight_" + partOfVariableName + " = originalDFRight_" + partOfVariableName + ".na.fill(UUID)"); + transformSql.add("val qualitis_names_right_" + partOfVariableName + " = fillNullDFRight_" + partOfVariableName + ".schema.fieldNames"); + transformSql.add("val fillNullWithFullLineWithHashDF_right_" + partOfVariableName + " = fillNullDFRight_" + partOfVariableName + ".withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"qualitis_full_line_hash_value\", md5(to_json(struct($\"*\"))))"); + transformSql.add("val qualitis_names_right_" + partOfVariableName + "buffer = qualitis_names_right_" + partOfVariableName + ".toBuffer"); - for (String partitionField : partitionFields) { - transformSql.add("qualitis_names_buffer -= \"" + partitionField + "\""); - transformSql.add("qualitis_names_buffer_2 -= \"" + partitionField + "\""); - } - transformSql.add("val finalDF = fileNullWithFullLineWithHashDF.drop(qualitis_names_buffer:_*)"); - transformSql.add("val finalDF_2 = fileNullWithFullLineWithHashDF_2.drop(qualitis_names_buffer_2:_*)"); + transformSql.add("val finalDF_left_" + partOfVariableName + " = fillNullWithFullLineWithHashDF_left_" + partOfVariableName + ".drop(qualitis_names_left_" + partOfVariableName + ":_*)"); + transformSql.add("val finalDF_right_" + partOfVariableName + " = fillNullWithFullLineWithHashDF_right_" + partOfVariableName + ".drop(qualitis_names_right_" + partOfVariableName + ":_*)"); - transformSql.add("finalDF.registerTempTable(\"md5_table_3\")"); - transformSql.add("finalDF_2.registerTempTable(\"md5_table_4\")"); + tmpRegisterTableLeft.append("md5_table_left_" + 
partOfVariableName); + tmpRegisterTableRight.append("md5_table_right_" + partOfVariableName); + transformSql.add("finalDF_left_" + partOfVariableName + ".registerTempTable(\"" + tmpRegisterTableLeft.toString() + "\")"); + transformSql.add("finalDF_right_" + partOfVariableName + ".registerTempTable(\"" + tmpRegisterTableRight.toString() + "\")"); } private List getImportSql() { @@ -619,72 +1310,225 @@ private List getImportSql() { return imports; } - private List saveStatisticAndSaveMySqlSentence(Long ruleId, Map ruleMetricIds, - Set templateStatisticsInputMetas, String applicationId, List ruleVariables, - String createTime, Integer count, String runDate) throws RuleVariableNotSupportException, RuleVariableNotFoundException { - return abstractTranslator.persistenceTranslate(ruleId, ruleMetricIds, templateStatisticsInputMetas, applicationId, ruleVariables, createTime - , count, runDate); + private List saveStatisticAndSaveMySqlSentence(String workFlowVersion, Long ruleId, Map ruleMetricIds + , Set templateStatisticsInputMetas, String applicationId, List ruleVariables, String createTime + , String partOfVariableName, String runDate, String user, StringBuilder realColumn, boolean enumListNewValue, boolean numRangeNewValue, + Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + return abstractTranslator.persistenceTranslate(workFlowVersion, ruleId, ruleMetricIds, templateStatisticsInputMetas, applicationId, ruleVariables, createTime + , partOfVariableName, runDate, user, realColumn, enumListNewValue, numRangeNewValue, selectResult, unionAllForSaveResult); } /** * Generate scala code of select statement and save into hive database + * * @param sql * @param saveTableName - * @param template - * @param count - * @param connParams + * @param rule + * @param partOfVariableName + * @param connParamMaps * @param runDate + * @param selectResult + * @param midTableReUse + * @param unionAllForSaveResult + * 
@param filterFields + * @param tableEnvs + * @param shareConnect + * @param shareFromPart * @return */ - private List generateSparkSqlAndSaveSentence(String sql, String saveTableName, Template template, Integer count, Map connParams, String runDate) { - List sparkSqlList = new ArrayList<>(); + private List generateSparkSqlAndSaveSentence(String sql, String saveTableName, Rule rule, String partOfVariableName, List> connParamMaps + , String runDate, Map selectResult, boolean midTableReUse, boolean unionAllForSaveResult, String filterFields, List> tableEnvs, boolean shareConnect, String shareFromPart) { String sparkSqlSentence; - if (connParams == null) { - sparkSqlSentence = getSparkSqlSentence(sql, count); + List sparkSqlList = new ArrayList<>(); + boolean linePrimaryRepeat = QualitisConstants.EXPECT_LINES_NOT_REPEAT_ID.equals(rule.getTemplate().getId()) || QualitisConstants.EXPECT_DATA_NOT_REPEAT_ID.equals(rule.getTemplate().getId()); + if (CollectionUtils.isEmpty(connParamMaps)) { + + if (CollectionUtils.isNotEmpty(tableEnvs)) { + List> sqlReplaceStrLists = QualitisCollectionUtils.getDescartes(tableEnvs); + + for (List subList : sqlReplaceStrLists) { + StringBuilder envName = new StringBuilder(); + for (String replaceStr : subList) { + String[] subStrs = replaceStr.split(SpecCharEnum.COLON.getValue()); + envName.append("[").append(subStrs[2]).append("]"); + String registerTable = subStrs[1]; + String realTable = subStrs[0]; + + sql = sql.replace(realTable, registerTable); + } + String partOfVariableNameWithEnv = partOfVariableName + envName.toString().replace("[", "").replace("]", ""); + sparkSqlList.add("// Generate the verification query code of rule " + partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + sparkSqlSentence = getSparkSqlSentence(sql, partOfVariableNameWithEnv, "", "", ""); + sparkSqlList.add(sparkSqlSentence); + + String variableFormer = getVariableNameByRule(partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[0],
partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + String variableLatter = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + formatSchema(sparkSqlList, partOfVariableName, variableFormer, variableLatter); + + selectResult.put(variableLatter, envName.toString()); + } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sparkSqlList, unionAllForSaveResult); + } else { + if (linePrimaryRepeat) { + sparkSqlList.add("val UUID = java.util.UUID.randomUUID.toString"); + } + sparkSqlSentence = getSparkSqlSentence(sql, partOfVariableName, filterFields, shareFromPart, ""); + LOGGER.info("Succeed to generate spark sql. sentence: {}", sparkSqlSentence); + sparkSqlList.add("// Generate the verification query code of rule " + partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + sparkSqlList.add(sparkSqlSentence); + + if (linePrimaryRepeat) { + handleLinePrimaryRepeat(sparkSqlList, partOfVariableName); + } + + String variableFormer = getVariableNameByRule(partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[0], partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + String variableLatter = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + formatSchema(sparkSqlList, partOfVariableName, variableFormer, variableLatter); + + if (Boolean.TRUE.equals(rule.getTemplate().getSaveMidTable())) { + sparkSqlList.addAll(getSaveMidTableSentenceSettings()); + sparkSqlList.addAll(getSaveMidTableSentence(saveTableName, partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1], runDate, midTableReUse)); + LOGGER.info("Succeed to generate spark sql. 
sentence."); + } + } + + return sparkSqlList; } else { - sparkSqlSentence = getSparkSqlSententceWithMysqlConnParams(sql, count, connParams); + // Repeat with envs. When polymerization, repeat one more time. + selectResult.putAll(getSparkSqlSententceWithMysqlConnParams(sql, partOfVariableName, connParamMaps, sparkSqlList, linePrimaryRepeat, rule.getTemplate().getSaveMidTable(), saveTableName, runDate, midTableReUse, unionAllForSaveResult, filterFields, shareConnect, shareFromPart)); } + return sparkSqlList; + } + + private void handleLinePrimaryRepeat(List sparkSqlList, Integer count) { + sparkSqlList.add("val fillNullDF_" + count + " = " + getVariableName(count) + ".na.fill(UUID)"); + sparkSqlList.add("val fillNullWithFullLineWithHashDF_" + count + " = fillNullDF_" + count + ".withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"md5\", md5(to_json(struct($\"*\"))))"); + sparkSqlList.add("fillNullWithFullLineWithHashDF_" + count + ".registerTempTable(\"tmp_table_" + count + "\")"); + sparkSqlList.add("val " + getVariableName(count) + " = spark.sql(\"select md5, count(1) as md5_count from tmp_table_" + count + " group by md5 having count(*) > 1\")"); + } - sparkSqlList.add(sparkSqlSentence); - List midTableInputNames = template.getTemplateMidTableInputMetas().stream().map(TemplateMidTableInputMeta::getName).collect(Collectors.toList()); + private void handleLinePrimaryRepeat(List sparkSqlList, String fullName) { + String suffix = fullName.split(SpecCharEnum.EQUAL.getValue())[1]; + sparkSqlList.add("val fillNullDF_" + suffix + " = " + getVariableNameByRule(fullName.split(SpecCharEnum.EQUAL.getValue())[0], suffix) + ".na.fill(UUID)"); + sparkSqlList.add("val fillNullWithFullLineWithHashDF_" + suffix + " = fillNullDF_" + suffix + ".withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"md5\", md5(to_json(struct($\"*\"))))"); + sparkSqlList.add("fillNullWithFullLineWithHashDF_" + suffix + ".registerTempTable(\"tmp_table_" 
+ suffix + "\")"); + sparkSqlList.add("val " + getVariableNameByRule(fullName.split(SpecCharEnum.EQUAL.getValue())[0], suffix) + " = spark.sql(\"select md5, count(1) as md5_count from tmp_table_" + suffix + " group by md5 having count(*) > 1\")"); + } - boolean linePrimaryRepeat = CollectionUtils.isNotEmpty(midTableInputNames) && (midTableInputNames.contains(EN_LINE_PRIMARY_REPEAT) || midTableInputNames.contains(CN_LINE_PRIMARY_REPEAT) || midTableInputNames.contains(MESSAGE_LINE_PRIMARY_REPEAT)); - if (linePrimaryRepeat) { - sparkSqlList.add("val fillNullDF_" + count + " = " + getVariableName(count) + ".na.fill(UUID)"); - sparkSqlList.add("val fileNullWithFullLineWithHashDF_" + count + " = fillNullDF_" + count + ".withColumn(\"qualitis_full_line_value\", to_json(struct($\"*\"))).withColumn(\"md5\", md5(to_json(struct($\"*\"))))"); - sparkSqlList.add("fileNullWithFullLineWithHashDF_" + count + ".registerTempTable(\"tmp_table_" + count + "\")"); - sparkSqlList.add("val " + getVariableName(count) + " = spark.sql(\"select md5, count(1) as md5_count from tmp_table_" + count + " group by md5 having count(*) > 1\")"); + private Map getSparkSqlSententceWithMysqlConnParams(String sql, String partOfVariableName, List> connParamMaps, List sparkSqlList + , boolean linePrimaryRepeat, Boolean saveMidTable, String saveTableName, String runDate, boolean midTableReUse, boolean unionAllForSaveResult, String filterFields, boolean shareConnect, String shareFromPart) { + Map selectResult = new HashMap<>(connParamMaps.size()); + for (Map connParams : connParamMaps) { + String envName = (String) connParams.get("envName"); + if (StringUtils.isEmpty(envName)) { + continue; + } + String tmpVariableName = partOfVariableName + envName; + String variableFormer = getVariableNameByRule(tmpVariableName.split(SpecCharEnum.EQUAL.getValue())[0], tmpVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + + if (shareConnect) { + sparkSqlList.add("// Generate the verification query code of rule " +
partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + " in environment " + envName); + sparkSqlList.add(getSparkSqlSentence(sql, tmpVariableName, filterFields, shareFromPart, SpecCharEnum.BOTTOM_BAR.getValue() + envName)); + } else { + + String tmp = sql.replace("\"", "\\\""); + String host = (String) connParams.get("host"); + String port = (String) connParams.get("port"); + String pwd = (String) connParams.get("password"); + String user = (String) connParams.get("username"); + String dataType = (String) connParams.get("dataType"); + String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, tmp) + .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) + .replace("${MYSQL_IP}", host) + .replace("${MYSQL_PORT}", port) + .replace("${MYSQL_USER}", user) + .replace("${MYSQL_PASSWORD}", pwd); + sparkSqlList.add("// Generate the verification query code of rule " + partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + sparkSqlList.add(str.replace(VARIABLE_NAME_PLACEHOLDER, variableFormer)); + } + if (linePrimaryRepeat) { + handleLinePrimaryRepeat(sparkSqlList, tmpVariableName); + } + + String variableLatter = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), tmpVariableName.split(SpecCharEnum.EQUAL.getValue())[1]); + selectResult.put(variableLatter, (String) connParams.get("envName")); + formatSchema(sparkSqlList, tmpVariableName, variableFormer, variableLatter); } + String lastVariable = getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName.split(SpecCharEnum.EQUAL.getValue())[1] + "Last"); + unionAllSaveResult(lastVariable, selectResult, sparkSqlList, unionAllForSaveResult); + LOGGER.info("Succeed to generate spark sql. sentence."); - LOGGER.info("Succeed to generate spark sql. sentence: {}", sparkSqlSentence); - // Fix bug in the workflow between widget node and qualitis node. 
- String variableFormer = getVariableName(count); - count ++; - String variableLatter = getVariableName(count); - formatSchema(sparkSqlList, variableFormer, variableLatter); - // Fix bug end. - if (template.getSaveMidTable()) { + if (saveMidTable) { sparkSqlList.addAll(getSaveMidTableSentenceSettings()); - sparkSqlList.addAll(getSaveMidTableSentence(saveTableName, count, runDate)); - LOGGER.info("Succeed to generate spark sql. sentence."); + sparkSqlList.addAll(getSaveMidTableSentence(saveTableName, runDate, midTableReUse, selectResult)); } - return sparkSqlList; + + return selectResult; } - private String getSparkSqlSententceWithMysqlConnParams(String sql, Integer count, Map<String, Object> connParams) { - sql = sql.replace("\"", "\\\""); - String host = (String) connParams.get("host"); - String port = (String) connParams.get("port"); - String user = (String) connParams.get("username"); - String pwd = (String) connParams.get("password"); - String dataType = (String) connParams.get("dataType"); - String str = SPARK_MYSQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sql) - .replace("${JDBC_DRIVER}", JDBC_DRIVER.get(dataType)) - .replace("${MYSQL_IP}", host) - .replace("${MYSQL_PORT}", port) - .replace("${MYSQL_USER}", user) - .replace("${MYSQL_PASSWORD}", pwd); - return str.replace(VARIABLE_NAME_PLACEHOLDER, getVariableName(count)); + private void unionAllSaveResult(String lastVariable, Map<String, String> selectResult, List<String> sparkSqlList, boolean unionAllForSaveResult) { + if (selectResult.size() > 1 && unionAllForSaveResult) { + StringBuilder saveUnion = new StringBuilder(); + boolean firstVar = true; + for (String varName : selectResult.keySet()) { + if (firstVar) { + saveUnion.append("val ").append(lastVariable).append(" = ").append(varName); + firstVar = false; + } else { + saveUnion.append(".unionAll(").append(varName).append(")"); + } + } + + selectResult.clear(); + selectResult.put(lastVariable, QualitisConstants.UNION_ALL); + sparkSqlList.add(saveUnion.toString()); + int index =
sparkSqlList.indexOf("Qualitis System Code Dividing Line"); + while (index != -1) { + sparkSqlList.remove(index); + index = sparkSqlList.indexOf("Qualitis System Code Dividing Line"); + } + } else { + sparkSqlList.add("Qualitis System Code Dividing Line"); + } + } + + private List getSaveMidTableSentence(String saveMidTableName, String runDate, boolean midTableReUse, Map selectResult) { + SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd"); + Calendar calendar = Calendar.getInstance(); + calendar.add(Calendar.DATE, -7); + Date monday = calendar.getTime(); + Date date = new Date(); + + String beforeDay = format.format(monday); + + List saveSqls = new ArrayList<>(); + + + if (!midTableReUse) { + saveSqls.add("spark.sql(\"DROP TABLE IF EXISTS " + saveMidTableName + "\")"); + } + + saveSqls.add(IF_EXIST.replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName)); + + // Delete 7 days before partition + String getPartitions = "val partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + " = spark.sql(\"select qualitis_partition_key from " + saveMidTableName + " where (qualitis_partition_key < " + beforeDay + ")\").map(f=>f.getString(0)).collect.toList"; + saveSqls.add(getPartitions); + String foreachDrop = "partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + ".foreach(f => spark.sql(\"alter table " + saveMidTableName + " drop if exists partition (qualitis_partition_key=\" + f + \")\"))"; + saveSqls.add(foreachDrop); + for (Map.Entry entry : selectResult.entrySet()) { + String key = entry.getKey(); + String value = entry.getValue(); + saveSqls.addAll(parsefirstHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION_WITH_ENV, value, key, saveMidTableName, runDate, date, format)); + } + saveSqls.add(ELSE_EXIST); + for (Map.Entry entry : selectResult.entrySet()) { + String key = entry.getKey(); + String value = 
entry.getValue(); + saveSqls.addAll(parseSecondHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE_WITH_ENV, value, key, saveMidTableName, runDate, date, format)); + } + saveSqls.add(END_EXIST); + return saveSqls; } private void formatSchema(List sparkSqlList, String variableFormer, String variableLatter) { @@ -696,6 +1540,20 @@ private void formatSchema(List sparkSqlList, String variableFormer, Stri sparkSqlList.add(str3); } + private void formatSchema(List sparkSqlList, String prefix, String variableFormer, String variableLatter) { + if (QualitisConstants.BDAP.equals(localConfig.getCluster())) { + prefix = prefix.split(SpecCharEnum.EQUAL.getValue())[1]; + String str1 = "val " + prefix + "_schemas = " + variableFormer + ".schema.fields.map(f => f.name).toList"; + String str2 = "val " + prefix + "_replacedSchemas = " + prefix + "_schemas.map(s => s.replaceAll(\"[()]\", \"\")).toList"; + String str3 = "val " + variableLatter + " = " + variableFormer + ".toDF(" + prefix + "_replacedSchemas: _*)"; + sparkSqlList.add(str1); + sparkSqlList.add(str2); + sparkSqlList.add(str3); + } else { + sparkSqlList.add("val " + variableLatter + " = " + variableFormer); + } + } + private List getSaveMidTableSentenceSettings() { List settings = new ArrayList<>(); settings.add("spark.sqlContext.setConf(\"hive.exec.dynamic.partition\", \"true\")"); @@ -705,41 +1563,140 @@ private List getSaveMidTableSentenceSettings() { return settings; } - private List getSaveMidTableSentence(String saveMidTableName, Integer count, String runDate) { + private List getSaveMidTableSentence(String saveMidTableName, Integer count, String runDate, boolean midTableReUse) { + SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd"); + Calendar calendar = Calendar.getInstance(); + calendar.add(Calendar.DATE, -7); + Date monday = calendar.getTime(); Date date = new Date(); + + String beforeDay = format.format(monday); + + List saveSqls = new ArrayList<>(); + + if (!midTableReUse) { + saveSqls.add("spark.sql(\"DROP 
TABLE IF EXISTS " + saveMidTableName + "\")"); + } + + saveSqls.add(IF_EXIST.replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName)); + + // Delete 7 days before partition + String getPartitions = "val partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + " = spark.sql(\"select qualitis_partition_key from " + saveMidTableName + " where (qualitis_partition_key < " + beforeDay + ")\").map(f=>f.getString(0)).collect.toList"; + saveSqls.add(getPartitions); + String foreachDrop = "partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + ".foreach(f => spark.sql(\"alter table " + saveMidTableName + " drop if exists partition (qualitis_partition_key=\" + f + \")\"))"; + saveSqls.add(foreachDrop); + + saveSqls.addAll(parsefirstHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION, "", getVariableName(count), saveMidTableName, runDate, date, format)); + saveSqls.add(ELSE_EXIST); + + saveSqls.addAll(parseSecondHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE, "", getVariableName(count), saveMidTableName, runDate, date, format)); + saveSqls.add(END_EXIST); + return saveSqls; + } + + private List getSaveMidTableSentence(String saveMidTableName, String partOfVariableName, String runDate, boolean midTableReUse) { SimpleDateFormat format = new SimpleDateFormat("yyyyMMdd"); + Calendar calendar = Calendar.getInstance(); + calendar.add(Calendar.DATE, -7); + Date monday = calendar.getTime(); + Date date = new Date(); + + String beforeDay = format.format(monday); + List saveSqls = new ArrayList<>(); + + if (!midTableReUse) { + saveSqls.add("spark.sql(\"DROP TABLE IF EXISTS " + saveMidTableName + "\")"); + } + saveSqls.add(IF_EXIST.replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName)); - saveSqls.add(SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION.replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? 
format.format(date) : runDate) - .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, getVariableName(count))); + + // Delete 7 days before partition + String getPartitions = "val partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + " = spark.sql(\"select qualitis_partition_key from " + saveMidTableName + " where (qualitis_partition_key < " + beforeDay + ")\").map(f=>f.getString(0)).collect.toList"; + saveSqls.add(getPartitions); + String foreachDrop = "partition_list_" + saveMidTableName.replace(SpecCharEnum.PERIOD_NO_ESCAPE.getValue(), SpecCharEnum.BOTTOM_BAR.getValue()) + ".foreach(f => spark.sql(\"alter table " + saveMidTableName + " drop if exists partition (qualitis_partition_key=\" + f + \")\"))"; + saveSqls.add(foreachDrop); + + saveSqls.addAll(parsefirstHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_INSERT_OVERWRITE_PARTITION, "", getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName), saveMidTableName, runDate, date, format)); saveSqls.add(ELSE_EXIST); - saveSqls.add(SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE.replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? 
format.format(date) : runDate) - .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, getVariableName(count))); + + saveSqls.addAll(parseSecondHalf(SAVE_MID_TABLE_SENTENCE_TEMPLATE_CREATE, "", getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName), saveMidTableName, runDate, date, format)); saveSqls.add(END_EXIST); return saveSqls; } - private String getSparkSqlSentence(String sql, Integer count) { + private List parsefirstHalf(String saveMidTableSentenceTemplateInsertOverwritePartition, String envName, String val, String saveMidTableName + , String runDate, Date date, SimpleDateFormat format) { + List saveSqls = new ArrayList<>(); + + if (StringUtils.isEmpty(envName) && StringUtils.isEmpty(val)) { + saveSqls.add(saveMidTableSentenceTemplateInsertOverwritePartition + .replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? format.format(date) : runDate) + .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, val)); + } else { + saveSqls.add(saveMidTableSentenceTemplateInsertOverwritePartition + .replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? format.format(date) : runDate) + .replace("${QUALITIS_PARTITION_KEY_ENV}", envName) + .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, val)); + } + + return saveSqls; + } + + private List parseSecondHalf(String saveMidTableSentenceTemplateCreate, String envName, String val, String saveMidTableName, String runDate + , Date date, SimpleDateFormat format) { + List saveSqls = new ArrayList<>(); + if (StringUtils.isEmpty(envName) && StringUtils.isEmpty(val)) { + saveSqls.add( + saveMidTableSentenceTemplateCreate.replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? 
format.format(date) : runDate) + .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, val)); + } else { + saveSqls.add( + saveMidTableSentenceTemplateCreate.replace("${QUALITIS_PARTITION_KEY}", StringUtils.isBlank(runDate) ? format.format(date) : runDate) + .replace("${QUALITIS_PARTITION_KEY_ENV}", envName) + .replace(SAVE_MID_TABLE_NAME_PLACEHOLDER, saveMidTableName).replace(VARIABLE_NAME_PLACEHOLDER, val)); + } + return saveSqls; + } + + private String getSparkSqlSentence(String sql, Integer count, String filterFields) { sql = sql.replace("\"", "\\\""); String str = SPARK_SQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sql); + if (StringUtils.isNotEmpty(filterFields)) { + str += filterFields; + } return str.replace(VARIABLE_NAME_PLACEHOLDER, getVariableName(count)); } + private String getSparkSqlSentence(String sql, String fullName, String filterFields, String shareFromPart, String envName) { + sql = sql.replace("\"", "\\\""); + + if (StringUtils.isNotEmpty(shareFromPart)) { + sql = sql.replace(shareFromPart, commonTableName + envName); + } + String str = SPARK_SQL_TEMPLATE.replace(SPARK_SQL_TEMPLATE_PLACEHOLDER, sql); + if (StringUtils.isNotEmpty(filterFields)) { + str += filterFields; + } + return str.replace(VARIABLE_NAME_PLACEHOLDER, getVariableNameByRule(fullName.split(SpecCharEnum.EQUAL.getValue())[0], fullName.split(SpecCharEnum.EQUAL.getValue())[1])); + } + /** * Replace all placeholder of template sql + * * @param template * @param variables * @param filter * @param realFilter - * @param dbTableMap for pick up source db.table & target db.table - * @param mappings + * @param realColumn + * @param dbTableMap for pick up source db.table & target db.table * @param date * @return * @throws ConvertException */ - private String replaceVariable(String template, List variables, String filter, StringBuffer realFilter, Map dbTableMap - , StringBuffer mappings, Date date) - throws ConvertException, 
UnExpectedRequestException { + private String replaceVariable(String template, List variables, String filter, StringBuilder realFilter, StringBuilder realColumn + , Map dbTableMap, Date date, String createUser) throws ConvertException, UnExpectedRequestException, MetaDataAcquireFailedException { + String sqlAction = template; if (StringUtils.isNotBlank(filter)) { String tmpfilter = DateExprReplaceUtil.replaceFilter(date, filter); @@ -752,30 +1709,30 @@ private String replaceVariable(String template, List variables, St for (RuleVariable ruleVariable : variables) { String midInputMetaPlaceHolder = ruleVariable.getTemplateMidTableInputMeta().getPlaceholder(); String placeHolder = "\\$\\{" + midInputMetaPlaceHolder + "}"; - // GeT source db and table, target db and table. - if ("source_db".equals(midInputMetaPlaceHolder)) { + // GeT left db and table, right db and table. + if ("left_database".equals(midInputMetaPlaceHolder)) { if (StringUtils.isNotBlank(ruleVariable.getValue())) { - dbTableMap.put("source_db", ruleVariable.getValue() + "."); + dbTableMap.put("left_database", ruleVariable.getValue() + "."); } else { - dbTableMap.put("source_db", ""); + dbTableMap.put("left_database", ""); } - } else if ("source_table".equals(midInputMetaPlaceHolder)) { - dbTableMap.put("source_table", ruleVariable.getValue()); - } else if ("target_table".equals(midInputMetaPlaceHolder)) { - dbTableMap.put("target_table", ruleVariable.getValue()); - } else if ("target_db".equals(midInputMetaPlaceHolder)) { + } else if ("left_table".equals(midInputMetaPlaceHolder)) { + dbTableMap.put("left_table", ruleVariable.getValue()); + } else if ("right_table".equals(midInputMetaPlaceHolder)) { + dbTableMap.put("right_table", ruleVariable.getValue()); + } else if ("right_database".equals(midInputMetaPlaceHolder)) { if (StringUtils.isNotBlank(ruleVariable.getValue())) { - dbTableMap.put("target_db", ruleVariable.getValue() + "."); + dbTableMap.put("right_database", ruleVariable.getValue() + "."); } 
else { - dbTableMap.put("target_db", ""); + dbTableMap.put("right_database", ""); } - } else if ("mapping_argument".equals(midInputMetaPlaceHolder)) { - mappings.append(ruleVariable.getValue()); + } else if (TemplateInputTypeEnum.FIELD.getCode().equals(ruleVariable.getTemplateMidTableInputMeta().getInputType()) && Boolean.TRUE.equals(ruleVariable.getTemplateMidTableInputMeta().getFieldMultipleChoice())) { + realColumn.append(ruleVariable.getValue()); } // Fix issue of wedget node in the front. - if ("\\$\\{field}".equals(placeHolder)) { + if ("\\$\\{fields}".equals(placeHolder)) { Matcher matcher = AGGREGATE_FUNC_PATTERN.matcher(ruleVariable.getValue()); - while(matcher.find()) { + while (matcher.find()) { String[] funcs = matcher.group().split("\n"); for (String func : funcs) { ruleVariable.setValue(ruleVariable.getValue().replace(func, "`" + func + "`")); @@ -788,11 +1745,17 @@ private String replaceVariable(String template, List variables, St } else { sqlAction = sqlAction.replaceAll(placeHolder, ruleVariable.getValue()); } - LOGGER.info("Succeed to replace {} into {}", placeHolder, ruleVariable.getValue()); } + + // Fix rule history + String contrastType = "{contrast_type}"; + if (sqlAction.contains(SpecCharEnum.DOLLAR.getValue() + contrastType)) { + sqlAction = sqlAction.replaceAll("\\$\\{contrast_type}", "FULL OUTER JOIN"); + } + if (PLACEHOLDER_PATTERN.matcher(sqlAction).matches()) { - throw new ConvertException("Unable to convert SQL, replacing placeholders failed, still having placeholder."); + throw new ConvertException("Unable to convert SQL, replacing placeholders failed, still having placeholder. 
sql: " + sqlAction); } return sqlAction; @@ -800,10 +1763,22 @@ private String replaceVariable(String template, List variables, St /** * Get tmp variable name + * * @param count * @return */ public String getVariableName(Integer count) { return "tmp" + count; } + + /** + * Get tmp variable name + * + * @param optPhase + * @param partOfVariableName + * @return + */ + public String getVariableNameByRule(String optPhase, String partOfVariableName) { + return optPhase + "Of" + partOfVariableName; + } } diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/TemplateConverterFactory.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/TemplateConverterFactory.java index 446034aa..ee55dd52 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/TemplateConverterFactory.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/converter/TemplateConverterFactory.java @@ -19,8 +19,6 @@ import com.webank.wedatasphere.qualitis.bean.DataQualityTask; import com.webank.wedatasphere.qualitis.constant.TaskTypeEnum; import com.webank.wedatasphere.qualitis.exception.TaskTypeException; -import com.webank.wedatasphere.qualitis.bean.DataQualityTask; -import com.webank.wedatasphere.qualitis.constant.TaskTypeEnum; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/AbstractTranslator.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/AbstractTranslator.java index ae5f2e69..45af8e49 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/AbstractTranslator.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/AbstractTranslator.java @@ -21,7 +21,6 @@ import com.webank.wedatasphere.qualitis.rule.entity.RuleVariable; import 
com.webank.wedatasphere.qualitis.rule.entity.TemplateStatisticsInputMeta; -import java.text.ParseException; import java.util.List; import java.util.Map; import java.util.Set; @@ -32,21 +31,28 @@ public abstract class AbstractTranslator { /** * Generate persistence statement. + * @param persistenceTranslate * @param ruleId * @param ruleMetricMaps * @param templateStatisticsInputMetas * @param applicationId * @param ruleVariables * @param createTime - * @param count + * @param partOfVariableName + * @param date * @param runDate + * @param realColumn + * @param enumListNewValue + * @param numRangeNewValue + * @param selectResult + * @param unionAllForSaveResult * @return * @throws RuleVariableNotSupportException * @throws RuleVariableNotFoundException */ - public abstract List persistenceTranslate(Long ruleId, Map ruleMetricMaps, - Set templateStatisticsInputMetas, String applicationId, List ruleVariables, - String createTime, Integer count, String runDate) throws RuleVariableNotSupportException, RuleVariableNotFoundException; + public abstract List persistenceTranslate(String persistenceTranslate, Long ruleId, Map ruleMetricMaps + , Set templateStatisticsInputMetas, String applicationId, List ruleVariables, String createTime + , String partOfVariableName, String date, String runDate, StringBuilder realColumn, boolean enumListNewValue, boolean numRangeNewValue, Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException; /** * Generate initial statement. 
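For reference between the two translator files: the `replaceVariable` flow shown above substitutes each `${...}` placeholder into the template SQL with `replaceAll`, then throws a `ConvertException` if any placeholder survives. A minimal standalone sketch of that contract follows; the class name, exception type, and pattern here are illustrative stand-ins, not the exact Qualitis fields.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderSketch {
    // Matches any ${...} placeholder left over after substitution.
    private static final Pattern PLACEHOLDER_PATTERN = Pattern.compile(".*\\$\\{.*}.*");

    // Substitute each ${key} with its value via replaceAll, then fail fast if
    // any placeholder survives -- mirroring the leftover-placeholder check
    // that replaceVariable performs before returning the final SQL.
    static String replaceVariables(String template, Map<String, String> variables) {
        String sqlAction = template;
        for (Map.Entry<String, String> e : variables.entrySet()) {
            String placeHolder = "\\$\\{" + e.getKey() + "}";
            // quoteReplacement keeps '$' and '\' in values from being
            // misread as regex back-references in the replacement string.
            sqlAction = sqlAction.replaceAll(placeHolder, Matcher.quoteReplacement(e.getValue()));
        }
        if (PLACEHOLDER_PATTERN.matcher(sqlAction).matches()) {
            throw new IllegalStateException("Unable to convert SQL, still having placeholder. sql: " + sqlAction);
        }
        return sqlAction;
    }
}
```

Note the use of `Matcher.quoteReplacement` on the value side: without it, a filter value containing `$` would corrupt the output, which is one reason the converter escapes and brackets user-supplied fields before substitution.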
diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/JdbcTranslator.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/JdbcTranslator.java index 4d647a14..9e394b19 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/JdbcTranslator.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/translator/JdbcTranslator.java @@ -17,6 +17,7 @@ package com.webank.wedatasphere.qualitis.translator; import com.webank.wedatasphere.qualitis.config.TaskDataSourceConfig; +import com.webank.wedatasphere.qualitis.constant.OptTypeEnum; import com.webank.wedatasphere.qualitis.converter.SqlTemplateConverter; import com.webank.wedatasphere.qualitis.dao.RuleMetricDao; import com.webank.wedatasphere.qualitis.exception.RuleVariableNotFoundException; @@ -25,15 +26,6 @@ import com.webank.wedatasphere.qualitis.rule.constant.StatisticsValueTypeEnum; import com.webank.wedatasphere.qualitis.rule.entity.RuleVariable; import com.webank.wedatasphere.qualitis.rule.entity.TemplateStatisticsInputMeta; -import java.text.ParseException; -import java.text.SimpleDateFormat; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Date; -import java.util.List; -import java.util.Map; -import java.util.Set; -import javax.annotation.PostConstruct; import org.apache.commons.collections.CollectionUtils; import org.apache.commons.lang3.StringUtils; import org.slf4j.Logger; @@ -43,6 +35,14 @@ import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty; import org.springframework.context.annotation.Configuration; +import javax.annotation.PostConstruct; +import javax.servlet.http.HttpServletRequest; +import javax.ws.rs.core.Context; +import java.text.ParseException; +import java.text.SimpleDateFormat; +import java.util.*; +import java.util.stream.Collectors; + /** * Generate scala code of connecting mysql and save data into mysql * Example: @@ -50,6 +50,7 @@ * 
prop.setProperty("user", "*****"); * prop.setProperty("password", "*****"); * tmp1.selectExpr("count(*) as value", "'QUALITIS_20190117094607_649214' as task_id", "'Long' as result_type", "'26' as rule_id", "'2019-01-17 09:46:07' as create_time").write.mode(org.apache.spark.sql.SaveMode.Append).jdbc("jdbc:mysql://host:port/*****", "application_task_result", prop);, + * * @author howeye */ @Configuration @@ -58,39 +59,72 @@ public class JdbcTranslator extends AbstractTranslator { @Value("${task.persistent.username}") private String mysqlUsername; @Value("${task.persistent.password}") - private String mysqlPassword; + private String mysqlSecret; @Value("${task.persistent.address}") private String mysqlAddress; @Value("${task.persistent.tableName}") private String resultTableName; + @Value("${task.new_value.tableName}") + private String newValueTableName; + @Value("${task.new_value.save}") + private String newValueTableSave; + @Value("${task.persistent.encrypt: false}") + private Boolean isEncrypt; + @Autowired private RuleMetricDao ruleMetricDao; + @Autowired private TaskDataSourceConfig taskDataSourceConfig; + private static final String PROP_VARIABLE_NAME = "prop"; private static final String STATISTICS_VALUE_FIELD_NAME = "value"; - private static final String STATISTICS_APPLICATION_ID_FIELD_NAME = "application_id"; private static final String STATISTICS_RULE_ID_FIELD_NAME = "rule_id"; + private static final String STATISTICS_APPLICATION_ID_FIELD_NAME = "application_id"; private static final String STATISTICS_RULE_METRIC_ID_FIELD_NAME = "rule_metric_id"; private static final String STATISTICS_RUN_DATE_FIELD_NAME = "run_date"; + private static final String STATISTICS_ENV_NAME_FIELD_NAME = "env_name"; private static final String STATISTICS_RESULT_FILED_TYPE = "result_type"; private static final String STATISTICS_CREATE_TIME = "create_time"; + private static final String STATISTICS_STATUS = "status"; + private static final String STATISTICS_VERSION = "version"; + 
private static final String STATISTICS_RULE_VERSION = "rule_version"; + private static final String STATISTICS_CREATE_USER = "create_user"; + private static final String STATISTICS_RESULT_VALUE = "result_value"; + private static final String STATISTICS_CUSTOM_COLUMN = "custom_column"; + private static final String STATISTICS_CONCAT_NEW_REAL_COLUMN = "${NEW_REAL_COLUMN}"; + private static final String STATISTICS_CONCAT_WS = "concat_ws(',',${NEW_REAL_COLUMN})"; private static final String STATISTICS_VALUE_PLACEHOLDER = "${VALUE}"; - private static final String STATISTICS_APPLICATION_ID_PLACEHOLDER = "${APPLICATION_ID}"; private static final String STATISTICS_RULE_ID_PLACEHOLDER = "${RULE_ID}"; - private static final String STATISTICS_RULE_METRIC_ID_PLACEHOLDER = "${RULE_METRIC_ID}"; + private static final String STATISTICS_VERSION_PLACEHOLDER = "${VERSION}"; private static final String STATISTICS_RUN_DATE_PLACEHOLDER = "${RUN_DATE}"; + private static final String STATISTICS_ENV_NAME_PLACEHOLDER = "${ENV_NAME}"; + private static final String STATISTICS_APPLICATION_ID_PLACEHOLDER = "${APPLICATION_ID}"; + private static final String STATISTICS_RULE_METRIC_ID_PLACEHOLDER = "${RULE_METRIC_ID}"; private static final String STATISTICS_RESULT_TYPE_PLACEHOLDER = "${RESULT_TYPE}"; private static final String STATISTICS_CREATE_TIME_PLACEHOLDER = "${CREATE_TIME}"; + private static final String STATISTICS_CREATE_USER_PLACEHOLDER = "${CREATE_USER}"; + private static final String STATISTICS_RULE_VERSION_PLACEHOLDER = "${RULE_VERSION}"; private static final String DECLARE_PROP_SENTENCE = "val " + PROP_VARIABLE_NAME + " = new java.util.Properties;"; + private static final Integer ONE = 1; + private String usernamePropSentence; private String passwordPropSentence; + private String taskNewVauleTemplate; + private String statisticsValueTemplate; + private String taskNumberRangeTemplate; private String statisticsAndSaveResultTemplate; + private static final Logger LOGGER = 
LoggerFactory.getLogger(JdbcTranslator.class); + private HttpServletRequest httpServletRequest; + + public JdbcTranslator(@Context HttpServletRequest httpServletRequest) { + this.httpServletRequest = httpServletRequest; + } /** * Initial statement @@ -98,39 +132,80 @@ public class JdbcTranslator extends AbstractTranslator { @PostConstruct public void init() { usernamePropSentence = PROP_VARIABLE_NAME + ".setProperty(\"user\", \"" + mysqlUsername + "\");"; - String password = taskDataSourceConfig.getPassword(); - passwordPropSentence = PROP_VARIABLE_NAME + ".setProperty(\"password\", \"" + password + "\");"; + if (isEncrypt) { +// String passwordPrivateKey = taskDataSourceConfig.getPrivateKey(); +// try { +// mysqlSecret = EncryptUtil.decrypt(passwordPrivateKey, taskDataSourceConfig.getPassword()); +// } catch (Exception e) { +// LOGGER.error("Decrypt mysqlsec password exception.", e); +// } + } else { + mysqlSecret = taskDataSourceConfig.getPassword(); + } + passwordPropSentence = PROP_VARIABLE_NAME + ".setProperty(\"password\", \"" + mysqlSecret + "\");"; statisticsAndSaveResultTemplate = SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER + ".selectExpr(\"" + STATISTICS_VALUE_PLACEHOLDER + " as " + STATISTICS_VALUE_FIELD_NAME + "\", \"'" + STATISTICS_APPLICATION_ID_PLACEHOLDER + "' as " + STATISTICS_APPLICATION_ID_FIELD_NAME + "\", \"'" + STATISTICS_RESULT_TYPE_PLACEHOLDER + "' as " + STATISTICS_RESULT_FILED_TYPE + "\", \"'" + STATISTICS_RULE_ID_PLACEHOLDER + "' as " + STATISTICS_RULE_ID_FIELD_NAME + "\", \"'" + + STATISTICS_VERSION_PLACEHOLDER + "' as " + STATISTICS_VERSION + "\", \"'" + STATISTICS_RULE_METRIC_ID_PLACEHOLDER + "' as " + STATISTICS_RULE_METRIC_ID_FIELD_NAME + "\", \"'" + STATISTICS_RUN_DATE_PLACEHOLDER + "' as " + STATISTICS_RUN_DATE_FIELD_NAME + "\", \"'" + + STATISTICS_ENV_NAME_PLACEHOLDER + "' as " + STATISTICS_ENV_NAME_FIELD_NAME + "\", \"'" + STATISTICS_CREATE_TIME_PLACEHOLDER + "' as " + STATISTICS_CREATE_TIME + 
"\").write.mode(org.apache.spark.sql.SaveMode.Append).jdbc(\"" + mysqlAddress + "\", \"" + resultTableName + "\", " + PROP_VARIABLE_NAME + ");"; + statisticsValueTemplate = SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER + ".selectExpr(\"" + + STATISTICS_VALUE_PLACEHOLDER + " as " + STATISTICS_VALUE_FIELD_NAME + "\")" + ".rdd.map(r => r(0)).collect()"; + // Initialize the new-value spark sql concatenation + taskNewVauleTemplate = SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER + ".selectExpr(\"" + + STATISTICS_RULE_ID_PLACEHOLDER + " as " + STATISTICS_RULE_ID_FIELD_NAME + "\", \"" + + ONE + " as " + STATISTICS_STATUS + "\", \"'" + + STATISTICS_CREATE_USER_PLACEHOLDER + "' as " + STATISTICS_CREATE_USER + "\", \"'" + + STATISTICS_RULE_VERSION_PLACEHOLDER + "' as " + STATISTICS_RULE_VERSION + "\", \"'" + + STATISTICS_CREATE_TIME_PLACEHOLDER + "' as " + STATISTICS_CREATE_TIME + "\", \"" + + STATISTICS_CONCAT_WS + " as " + STATISTICS_RESULT_VALUE + + "\")." + newValueTableSave + ".jdbc(\"" + mysqlAddress + "\", \"" + newValueTableName + "\", " + + PROP_VARIABLE_NAME + ");"; + // Initialize the number-range spark sql concatenation + taskNumberRangeTemplate = SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER + ".selectExpr(\"" + + STATISTICS_RULE_ID_PLACEHOLDER + " as " + STATISTICS_RULE_ID_FIELD_NAME + "\", \"" + + ONE + " as " + STATISTICS_STATUS + "\", \"'" + + STATISTICS_CREATE_USER_PLACEHOLDER + "' as " + STATISTICS_CREATE_USER + "\", \"'" + + STATISTICS_RULE_VERSION_PLACEHOLDER + "' as " + STATISTICS_RULE_VERSION + "\", \"'" + + STATISTICS_CREATE_TIME_PLACEHOLDER + "' as " + STATISTICS_CREATE_TIME + "\", \"" + + STATISTICS_CUSTOM_COLUMN + " as " + STATISTICS_RESULT_VALUE + + "\")."
+ newValueTableSave + ".jdbc(\"" + mysqlAddress + "\", \"" + newValueTableName + "\", " + + PROP_VARIABLE_NAME + ");"; } /** * Generate statistic statement and save mysql statement + * * @param ruleId * @param ruleMetricMaps * @param templateStatisticsInputMetas * @param applicationId * @param ruleVariables * @param createTime - * @param count + * @param partOfVariableName * @param runDate + * @param user + * @param realColumn + * @param enumListNewValue + * @param numRangeNewValue + * @param selectResult + * @param unionAllForSaveResult * @return * @throws RuleVariableNotSupportException * @throws RuleVariableNotFoundException */ @Override - public List persistenceTranslate(Long ruleId, Map ruleMetricMaps, Set templateStatisticsInputMetas - , String applicationId, List ruleVariables, String createTime, Integer count, String runDate) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + public List persistenceTranslate(String workFlowVersion, Long ruleId, Map ruleMetricMaps, Set templateStatisticsInputMetas, + String applicationId, List ruleVariables, String createTime, String partOfVariableName, String runDate, String user, StringBuilder realColumn, boolean enumListNewValue, boolean numRangeNewValue, Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException { List list = new ArrayList<>(); - list.addAll(getStatisticsAndSaveSentence(ruleId, ruleMetricMaps, templateStatisticsInputMetas, applicationId, ruleVariables, createTime, count, runDate)); + list.addAll(getStatisticsAndSaveSentence(workFlowVersion,ruleId, ruleMetricMaps, templateStatisticsInputMetas, applicationId, ruleVariables, createTime, partOfVariableName + , runDate, user, realColumn, enumListNewValue, numRangeNewValue, selectResult, unionAllForSaveResult)); return list; } @@ -145,36 +220,52 @@ private String getDriver() { /** * Replace all place holder in sql, and generate save mysql statement + * + * @param workFlowVersion 
* @param ruleId * @param ruleMetricMap * @param templateStatisticsInputMetas * @param applicationId * @param ruleVariables * @param createTime - * @param count + * @param partOfVariableName * @param runDate + * @param realColumn + * @param enumListNewValue + * @param numRangeNewValue + * @param selectResult + * @param unionAllForSaveResult * @return * @throws RuleVariableNotSupportException * @throws RuleVariableNotFoundException */ - private List getStatisticsAndSaveSentence(Long ruleId, Map ruleMetricMap, - Set templateStatisticsInputMetas, String applicationId, List ruleVariables, - String createTime, Integer count, String runDate) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + private List getStatisticsAndSaveSentence(String workFlowVersion, Long ruleId, Map ruleMetricMap + , Set templateStatisticsInputMetas, String applicationId, List ruleVariables + , String createTime, String partOfVariableName, String runDate, String user, StringBuilder realColumn, boolean enumListNewValue, boolean numRangeNewValue, Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + List list = new ArrayList<>(); + Map newRuleMetricMap = new HashMap<>(2); + if (CollectionUtils.isNotEmpty(ruleMetricMap.entrySet())) { + for (Map.Entry entry : ruleMetricMap.entrySet()) { + String key = entry.getKey(); + Long value = entry.getValue(); + newRuleMetricMap.put(key.replace("-", "_"), value); + } + } if (StringUtils.isBlank(runDate)) { - sentenceWithoutRunDate(templateStatisticsInputMetas, ruleVariables, list, applicationId, createTime, count, ruleId, ruleMetricMap); + sentenceWithoutRunDate(workFlowVersion, templateStatisticsInputMetas, ruleVariables, list, applicationId, createTime, partOfVariableName, ruleId, newRuleMetricMap, user, enumListNewValue, numRangeNewValue, realColumn, selectResult, unionAllForSaveResult); } else { - sentenceWithRunDate(templateStatisticsInputMetas, ruleVariables, list, 
applicationId, createTime, count, ruleId, ruleMetricMap, runDate); - + sentenceWithRunDate(workFlowVersion, templateStatisticsInputMetas, ruleVariables, list, applicationId, createTime, partOfVariableName, ruleId, newRuleMetricMap, user, runDate, enumListNewValue, numRangeNewValue, realColumn, selectResult, unionAllForSaveResult); } return list; } - private void sentenceWithRunDate(Set templateStatisticsInputMetas, List ruleVariables, - List list, String applicationId, String createTime, Integer count, Long ruleId, Map ruleMetricMap, String runDate) - throws RuleVariableNotSupportException, RuleVariableNotFoundException { - Date runRealDate = null; + private void sentenceWithRunDate(String workFlowVersion, Set templateStatisticsInputMetas, + List ruleVariables, List list, String applicationId, String createTime, String partOfVariableName, Long ruleId, Map ruleMetricMap, + String user, String runDate, boolean enumListNewValue, boolean numRangeNewValue, StringBuilder realColumn, Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + Date runRealDate; + try { runRealDate = new SimpleDateFormat("yyyyMMdd").parse(runDate); } catch (ParseException e) { @@ -185,90 +276,178 @@ private void sentenceWithRunDate(Set templateStatis list.add("classOf[com.mysql.jdbc.Driver]"); list.add("try {"); list.add("\tconnection = DriverManager.getConnection(\"" + mysqlAddress + "\", " + PROP_VARIABLE_NAME + ")") + if (selectResult != null && CollectionUtils.isNotEmpty(selectResult.keySet())) { + List varList = selectResult.keySet().stream().collect(Collectors.toList()); + + for (String variable : varList) { + // Aggregation handling + if (unionAllForSaveResult) { + constructStaticSqlWithRunDate(templateStatisticsInputMetas, ruleVariables, runRealDate, applicationId, createTime, ruleId, variable, selectResult.get(variable), workFlowVersion, ruleMetricMap, list); + break; + } + constructStaticSqlWithRunDate(templateStatisticsInputMetas,
ruleVariables, runRealDate, applicationId, createTime, ruleId, variable, selectResult.get(variable), workFlowVersion, ruleMetricMap, list); + } + list.add("} catch {"); + list.add("\tcase e: Exception => println(\"JDBC operations failed because of \", e.getMessage())"); + list.add("} finally {"); + list.add("\tconnection.close()"); + list.add("}"); + // Handle new value + handleNewValue(workFlowVersion, user, realColumn, createTime, partOfVariableName, ruleId, list, enumListNewValue, numRangeNewValue, ""); + return; + } + + constructStaticSqlWithRunDate(templateStatisticsInputMetas, ruleVariables, runRealDate, applicationId, createTime, ruleId, getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName), "", workFlowVersion, ruleMetricMap, list); + + list.add("} catch {"); + list.add("\tcase e: Exception => println(\"JDBC operations failed because of \", e.getMessage())"); + list.add("} finally {"); + list.add("\tconnection.close()"); + list.add("}"); + + // Handle new value + handleNewValue(workFlowVersion, user, realColumn, createTime, partOfVariableName, ruleId, list, enumListNewValue, numRangeNewValue, ""); + } + + private void constructStaticSqlWithRunDate(Set templateStatisticsInputMetas, List ruleVariables, Date runRealDate + , String applicationId, String createTime, Long ruleId, String variable, String envName, String workFlowVersion, Map ruleMetricMap, List list) throws RuleVariableNotSupportException, RuleVariableNotFoundException { for (TemplateStatisticsInputMeta s : templateStatisticsInputMetas) { String funcName = s.getFuncName(); String value = getValue(ruleVariables, s); String persistSentence = statisticsAndSaveResultTemplate .replace(STATISTICS_VALUE_PLACEHOLDER, funcName + "(" + value + ")") - .replace(STATISTICS_APPLICATION_ID_PLACEHOLDER, applicationId) .replace(STATISTICS_RESULT_TYPE_PLACEHOLDER, s.getResultType()) + .replace(STATISTICS_APPLICATION_ID_PLACEHOLDER, applicationId) 
.replace(STATISTICS_CREATE_TIME_PLACEHOLDER, createTime) - .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, getVariable(count)) - .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + ""); + .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + "") + .replace(STATISTICS_ENV_NAME_PLACEHOLDER, envName) + .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, variable) + .replace(STATISTICS_VERSION_PLACEHOLDER, workFlowVersion + ""); + persistSentence = persistSentence.replace(STATISTICS_RUN_DATE_PLACEHOLDER, runRealDate.getTime() + ""); - StringBuffer selectSql = new StringBuffer(); - StringBuffer deleteSql = new StringBuffer(); - String varName = s.getName().replace("{", "").replace("}", "").replace("&", ""); - if (ruleMetricMap.get(value) != null) { - persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, ruleMetricMap.get(value) + ""); - selectSql.append("val selectSql").append("_").append(varName) + String statisticsValueSentence = "val res_" + variable + " = " + statisticsValueTemplate.replace(STATISTICS_VALUE_PLACEHOLDER, funcName + "(" + value + ")") + .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, variable); + String realValueName = "resValue_" + variable; + String realValueSentence = "val " + realValueName + " = " + "res_" + variable + "(0)"; + StringBuilder selectSql = new StringBuilder(); + StringBuilder updateSql = new StringBuilder(); + persistSentence = judgeRuleMetricMap(ruleId, ruleMetricMap, runRealDate, value, persistSentence, selectSql, updateSql, variable, realValueName, envName); + list.add("\t" + selectSql.toString()); + // Judge the existence of task result with rule ID, rule metric ID, run date. 
+ list.add("\tval resultDF" + "_" + variable + " = spark.read.jdbc(\"" + mysqlAddress + "\", selectSql" + "_" + variable + ", prop)"); + list.add("\tval lines" + "_" + variable + " = resultDF" + "_" + variable + ".count()"); + list.add("\tif (lines" + "_" + variable + " >= 1) {"); + // Update the exist task result before insert. + list.add("\t\t" + statisticsValueSentence); + list.add("\t\t" + realValueSentence); + list.add("\t\t" + updateSql.toString()); + list.add("\t\tconnection.createStatement().executeUpdate(updateSql" + "_" + variable + ")"); + int index = persistSentence.indexOf("selectExpr(") + "selectExpr(".length(); + StringBuilder notSaveResultSentence = new StringBuilder(persistSentence); + list.add("\t\t" + notSaveResultSentence.insert(index, "\"0 as save_result\", ").toString()); + list.add("\t} else {"); + list.add("\t\t" + persistSentence); + list.add("\t}"); + LOGGER.info("Succeed to get persist sentence. sentence: {}", persistSentence); + } + } + + private String judgeRuleMetricMap(Long ruleId, Map ruleMetricMap, Date runRealDate, String value, String persistSentence, + StringBuilder selectSql, StringBuilder updateSql, String variable, String realValueName, String envName) { + if (ruleMetricMap.get(value) != null) { + persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, ruleMetricMap.get(value) + ""); + selectSql.append("val selectSql").append("_").append(variable) .append(" = \"(select * from ").append(resultTableName).append(" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append(ruleMetricMap.get(value)) + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")) qualitis_tmp_table\""); - deleteSql.append("val deleteSql").append("_").append(varName) - .append(" = \"delete from ").append(resultTableName).append(" where rule_id = ").append(ruleId) + updateSql.append("val 
updateSql").append("_").append(variable) + .append(" = \"update ").append(resultTableName).append(" set value = \"").append(" + ").append(realValueName).append(" + ").append("\" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append(ruleMetricMap.get(value)) + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")\""); - } else { - if (CollectionUtils.isNotEmpty(ruleMetricMap.values())) { - persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, ruleMetricMap.values().iterator().next() + ""); - selectSql.append("val selectSql").append("_").append(varName) + } else { + if (CollectionUtils.isNotEmpty(ruleMetricMap.values())) { + persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, ruleMetricMap.values().iterator().next() + ""); + selectSql.append("val selectSql").append("_").append(variable) .append(" = \"(select * from ").append(resultTableName).append(" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append(ruleMetricMap.values().iterator().next()) + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")) qualitis_tmp_table\""); - deleteSql.append("val deleteSql").append("_").append(varName) - .append(" = \"delete from ").append(resultTableName).append(" where rule_id = ").append(ruleId) + updateSql.append("val updateSql").append("_").append(variable) + .append(" = \"update ").append(resultTableName).append(" set value = \"").append(" + ").append(realValueName).append(" + ").append("\" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append(ruleMetricMap.values().iterator().next()) + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")\""); 
- } else { - persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, "-1"); - selectSql.append("val selectSql").append("_").append(varName) + } else { + persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, "-1"); + selectSql.append("val selectSql").append("_").append(variable) .append(" = \"(select * from ").append(resultTableName).append(" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append("-1") + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")) qualitis_tmp_table\""); - deleteSql.append("val deleteSql").append("_").append(varName) - .append(" = \"delete from ").append(resultTableName).append(" where rule_id = ").append(ruleId) + updateSql.append("val updateSql").append("_").append(variable) + .append(" = \"update ").append(resultTableName).append(" set value = \"").append(" + ").append(realValueName).append(" + ").append("\" where rule_id = ").append(ruleId) .append(" and rule_metric_id = ").append("-1") + .append(" and save_result = 1") + .append(" and env_name = '").append(envName).append("'") .append(" and (run_date = ").append(runRealDate.getTime()) .append(")\""); + } + } + return persistSentence; + } + + private void sentenceWithoutRunDate(String workFlowVersion, Set templateStatisticsInputMetas + , List ruleVariables, List list, String applicationId, String createTime, String partOfVariableName, Long ruleId + , Map ruleMetricMap, String user, boolean enumListNewValue, boolean numRangeNewValue, StringBuilder realColumn, Map selectResult, boolean unionAllForSaveResult) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + if (selectResult != null && CollectionUtils.isNotEmpty(selectResult.keySet())) { + List varList = selectResult.keySet().stream().collect(Collectors.toList()); + + for (String variable : varList) { + // Aggregation handling + if
(unionAllForSaveResult) { + constructStaticSql(templateStatisticsInputMetas, ruleVariables, applicationId, createTime, partOfVariableName, ruleId, workFlowVersion, ruleMetricMap, list, variable, selectResult.get(variable)); + // Handle new value + handleNewValue(workFlowVersion, user, realColumn, createTime, partOfVariableName, ruleId, list, enumListNewValue, numRangeNewValue, variable); + break; } + constructStaticSql(templateStatisticsInputMetas, ruleVariables, applicationId, createTime, partOfVariableName, ruleId, workFlowVersion, ruleMetricMap, list, variable, selectResult.get(variable)); + // Handle new value + handleNewValue(workFlowVersion, user, realColumn, createTime, partOfVariableName, ruleId, list, enumListNewValue, numRangeNewValue, variable); } - list.add(selectSql.toString()); - // Judge the existence of task result with rule ID, rule metric ID, run date. - list.add("val resultDF" + "_" + varName + " = spark.read.jdbc(\"" + mysqlAddress + "\", selectSql" + "_" + varName + ", prop)"); - list.add("val lines" + "_" + varName + " = resultDF" + "_" + varName + ".count()"); - list.add("if (lines" + "_" + varName + " >= 1) {"); - // Delete the exist task result before insert. - list.add(deleteSql.toString()); - list.add("connection.createStatement().executeUpdate(deleteSql" + "_" + varName + ")"); - list.add("}"); - list.add(persistSentence); - LOGGER.info("Succeed to get persist sentence. 
sentence: {}", persistSentence); + return; } - list.add("} catch {"); - list.add("case e: Exception => println(\"JDBC operations failed because of \", e.getMessage())"); - list.add("} finally {"); - list.add("\tconnection.close()"); - list.add("}"); + constructStaticSql(templateStatisticsInputMetas, ruleVariables, applicationId, createTime, partOfVariableName, ruleId, workFlowVersion, ruleMetricMap, list, "", ""); + + // Handle new value + handleNewValue(workFlowVersion, user, realColumn, createTime, partOfVariableName, ruleId, list, enumListNewValue, numRangeNewValue, ""); } - private void sentenceWithoutRunDate(Set templateStatisticsInputMetas, List ruleVariables, List list - , String applicationId, String createTime, Integer count, Long ruleId, Map ruleMetricMap) throws RuleVariableNotSupportException, RuleVariableNotFoundException { + private void constructStaticSql(Set templateStatisticsInputMetas, List ruleVariables, String applicationId + , String createTime, String partOfVariableName, Long ruleId, String workFlowVersion, Map ruleMetricMap, List list, String realVariable, String envName) throws RuleVariableNotSupportException, RuleVariableNotFoundException { for (TemplateStatisticsInputMeta s : templateStatisticsInputMetas) { String funcName = s.getFuncName(); String value = getValue(ruleVariables, s); String persistSentence = statisticsAndSaveResultTemplate .replace(STATISTICS_VALUE_PLACEHOLDER, funcName + "(" + value + ")") - .replace(STATISTICS_APPLICATION_ID_PLACEHOLDER, applicationId) .replace(STATISTICS_RESULT_TYPE_PLACEHOLDER, s.getResultType()) + .replace(STATISTICS_APPLICATION_ID_PLACEHOLDER, applicationId) .replace(STATISTICS_CREATE_TIME_PLACEHOLDER, createTime) - .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, getVariable(count)) - .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + ""); + .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + "") + .replace(STATISTICS_ENV_NAME_PLACEHOLDER, envName + "") + 
.replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, StringUtils.isNotBlank(realVariable) ? realVariable : getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName)) + .replace(STATISTICS_VERSION_PLACEHOLDER, workFlowVersion + ""); if (ruleMetricMap.get(value) != null) { persistSentence = persistSentence.replace(STATISTICS_RULE_METRIC_ID_PLACEHOLDER, ruleMetricMap.get(value) + ""); @@ -283,10 +462,40 @@ private void sentenceWithoutRunDate(Set templateSta list.add(persistSentence); LOGGER.info("Succeed to get persist sentence. sentence: {}", persistSentence); } + + } + + private void handleNewValue(String workFlowVersion, String user, StringBuilder realColumn, String createTime, String partOfVariableName, Long ruleId, + List list, boolean enumListNewValue, boolean numRangeNewValue, String realVariable) { + // Enum values: new value replacement + if (enumListNewValue) { + String newValueDisplace = taskNewVauleTemplate + .replace(STATISTICS_CREATE_USER_PLACEHOLDER, user) + .replace(STATISTICS_CONCAT_NEW_REAL_COLUMN, realColumn.toString()) + .replace(STATISTICS_CREATE_TIME_PLACEHOLDER, createTime) + .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, StringUtils.isNotBlank(realVariable) ? realVariable : getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName)) + .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + "") + .replace(STATISTICS_RULE_VERSION_PLACEHOLDER, workFlowVersion + ""); + list.add(newValueDisplace); + LOGGER.info("Succeed to get newValueDisplace. Value: {}", newValueDisplace); + } + + // Number range: new value replacement + if (numRangeNewValue) { + String replacementFor = taskNumberRangeTemplate + .replace(STATISTICS_CREATE_USER_PLACEHOLDER, user) + .replace(STATISTICS_CREATE_TIME_PLACEHOLDER, createTime) + .replace(SqlTemplateConverter.VARIABLE_NAME_PLACEHOLDER, StringUtils.isNotBlank(realVariable) ?
realVariable : getVariableNameByRule(OptTypeEnum.STATISTIC_DF.getMessage(), partOfVariableName)) + .replace(STATISTICS_RULE_ID_PLACEHOLDER, ruleId + "") + .replace(STATISTICS_RULE_VERSION_PLACEHOLDER, workFlowVersion + ""); + list.add(replacementFor); + LOGGER.info("Succeed to get replacementFor. Value: {}", replacementFor); + } } /** * Get argument value from statistics step + * + * @param ruleVariables * @param templateStatisticsInputMeta * @return * @throws RuleVariableNotFoundException */ private String getValue(List ruleVariables, TemplateStatisticsInputMeta templateStatisticsInputMeta) - throws RuleVariableNotSupportException, RuleVariableNotFoundException { + throws RuleVariableNotSupportException, RuleVariableNotFoundException { if (templateStatisticsInputMeta.getValueType().equals(StatisticsValueTypeEnum.FIXED_VALUE.getCode())) { return templateStatisticsInputMeta.getValue(); } else { for (RuleVariable ruleVariable : ruleVariables) { - if (!ruleVariable.getInputActionStep().equals(InputActionStepEnum.STATISTICS_ARG.getCode())) { + if (!
ruleVariable.getInputActionStep().equals(InputActionStepEnum.STATISTICS_ARG.getCode())) { throw new RuleVariableNotSupportException("Action_step of rule_variable " + ruleVariable.getInputActionStep() + " does not support"); } @@ -327,10 +536,22 @@ private String getPropSentence() { /** * Get tmp variable + * * @param count * @return */ private String getVariable(Integer count) { return "tmp" + count; } + + /** + * Get tmp variable name + * + * @param optPhase + * @param partOfVariableName + * @return + */ + public String getVariableNameByRule(String optPhase, String partOfVariableName) { + return optPhase + "Of" + partOfVariableName; + } } diff --git a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/util/DateExprReplaceUtil.java b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/util/DateExprReplaceUtil.java index 9f465a22..e19208d8 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/util/DateExprReplaceUtil.java +++ b/core/converter/src/main/java/com/webank/wedatasphere/qualitis/util/DateExprReplaceUtil.java @@ -18,14 +18,14 @@ import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; -import java.util.HashMap; -import java.util.Map; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import java.text.SimpleDateFormat; import java.util.Calendar; import java.util.Date; +import java.util.HashMap; +import java.util.Map; import java.util.regex.Matcher; import java.util.regex.Pattern; @@ -45,12 +45,13 @@ private DateExprReplaceUtil() { private static final Pattern DIGITAL_PATTERN = Pattern.compile("[0-9]+"); private static final Pattern CUSTOM_PLACEHOLODER_PATTERN = Pattern.compile("\\$\\{[^ ]*}"); - private static final Map RUN_DATE_FORMAT = new HashMap(2){{ - put("run_date","yyyyMMdd"); - put("run_date_std", "yyyy-MM-dd"); - }}; - + private static final Map RUN_DATE_FORMAT = new HashMap(2); + static { + 
RUN_DATE_FORMAT.put("run_date","yyyyMMdd"); + RUN_DATE_FORMAT.put("run_date_std", "yyyy-MM-dd"); + RUN_DATE_FORMAT.put("run_today_h_std", "yyyy-MM-dd HH"); + } private static final Logger LOGGER = LoggerFactory.getLogger(DateExprReplaceUtil.class); @@ -108,7 +109,7 @@ public static String replaceRunDate(Date date, String midTableAction) throws UnE Matcher matcher = CUSTOM_PLACEHOLODER_PATTERN.matcher(midTableAction); while (matcher.find()) { String replaceStr = matcher.group(); - boolean legalSystemParams = replaceStr.contains("run_date") || replaceStr.contains("run_date_std"); + boolean legalSystemParams = replaceStr.contains("run_date") || replaceStr.contains("run_date_std") || replaceStr.contains("run_today_h_std"); if (! legalSystemParams) { throw new UnExpectedRequestException("Custom placeholoder must be system variables."); } @@ -121,6 +122,9 @@ public static String replaceRunDate(Date date, String midTableAction) throws UnE calendar.setTime(date); calendar.add(Calendar.DATE, 0 - forwayDay - 1); dateStr = new SimpleDateFormat(RUN_DATE_FORMAT.get(keys[0])).format(calendar.getTime()); + } else if ("run_today_h_std".equals(currentParam)){ + calendar.setTime(date); + dateStr = new SimpleDateFormat(RUN_DATE_FORMAT.get(currentParam)).format(calendar.getTime()); } else { calendar.setTime(date); calendar.add(Calendar.DATE, -1); @@ -137,7 +141,7 @@ public static String replaceFilter(Date date, String filter) throws UnExpectedRe Matcher matcher = CUSTOM_PLACEHOLODER_PATTERN.matcher(filter); while (matcher.find()) { String replaceStr = matcher.group(); - boolean legalSystemParams = replaceStr.contains("run_date") || replaceStr.contains("run_date_std"); + boolean legalSystemParams = replaceStr.contains("run_date") || replaceStr.contains("run_date_std") || replaceStr.contains("run_today_h_std"); if (! 
legalSystemParams) { throw new UnExpectedRequestException("Custom placeholoder must be system variables."); } @@ -150,6 +154,9 @@ public static String replaceFilter(Date date, String filter) throws UnExpectedRe calendar.setTime(date); calendar.add(Calendar.DATE, 0 - forwayDay - 1); dateStr = new SimpleDateFormat(RUN_DATE_FORMAT.get(keys[0])).format(calendar.getTime()); + } else if ("run_today_h_std".equals(currentParam)){ + calendar.setTime(date); + dateStr = new SimpleDateFormat(RUN_DATE_FORMAT.get(currentParam)).format(calendar.getTime()); } else { calendar.setTime(date); calendar.add(Calendar.DATE, -1); diff --git a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityTask.java b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityTask.java index 18ad326b..11ec43a1 100644 --- a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityTask.java +++ b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/bean/DataQualityTask.java @@ -17,27 +17,30 @@ package com.webank.wedatasphere.qualitis.bean; import com.webank.wedatasphere.qualitis.constant.TaskTypeEnum; +import com.webank.wedatasphere.qualitis.exception.ArgumentException; import com.webank.wedatasphere.qualitis.rule.constant.TemplateActionTypeEnum; import com.webank.wedatasphere.qualitis.rule.entity.Rule; -import com.webank.wedatasphere.qualitis.exception.ArgumentException; -import com.webank.wedatasphere.qualitis.constant.TaskTypeEnum; -import com.webank.wedatasphere.qualitis.exception.ArgumentException; - -import java.util.List; +import java.util.Map; import java.util.stream.Collectors; +import java.util.List; /** * @author howeye */ public class DataQualityTask { - private String applicationId; private Integer taskType; + private String applicationId; private List ruleTaskDetails; + private String startupParam; private String createTime; private String partition; private Long taskId; private String user; - private String startupParam; + 
private String dbShare; + private String tableShare; + private String filterShare; + private String columnShare; + private List> connectShare; public DataQualityTask() { } @@ -45,6 +48,7 @@ public DataQualityTask() { public DataQualityTask(String applicationId, String createTime, String partition, List ruleTaskDetails) throws ArgumentException { List rules = ruleTaskDetails.stream().map(RuleTaskDetail::getRule).collect(Collectors.toList()); List actionTypeList = rules.stream().map(rule -> rule.getTemplate().getActionType()).distinct().collect(Collectors.toList()); + if (actionTypeList.isEmpty()) { throw new ArgumentException("Error! Action type can not be null"); } @@ -64,26 +68,10 @@ public DataQualityTask(String applicationId, String createTime, String partition throw new ArgumentException("Error! Action type: [" + actionType + "] is not supported"); } + this.ruleTaskDetails = ruleTaskDetails; this.applicationId = applicationId; this.createTime = createTime; this.partition = partition; - this.ruleTaskDetails = ruleTaskDetails; - } - - public String getUser() { - return user; - } - - public void setUser(String user) { - this.user = user; - } - - public Long getTaskId() { - return taskId; - } - - public void setTaskId(Long taskId) { - this.taskId = taskId; } public Integer getTaskType() { @@ -102,6 +90,22 @@ public void setApplicationId(String applicationId) { this.applicationId = applicationId; } + public List getRuleTaskDetails() { + return ruleTaskDetails; + } + + public void setRuleTaskDetails(List ruleTaskDetails) { + this.ruleTaskDetails = ruleTaskDetails; + } + + public String getStartupParam() { + return startupParam; + } + + public void setStartupParam(String startupParam) { + this.startupParam = startupParam; + } + public String getCreateTime() { return createTime; } @@ -118,32 +122,77 @@ public void setPartition(String partition) { this.partition = partition; } - public List getRuleTaskDetails() { - return ruleTaskDetails; + public Long getTaskId() { + 
return taskId; } - public void setRuleTaskDetails(List ruleTaskDetails) { - this.ruleTaskDetails = ruleTaskDetails; + public void setTaskId(Long taskId) { + this.taskId = taskId; } - public String getStartupParam() { - return startupParam; + public String getUser() { + return user; } - public void setStartupParam(String startupParam) { - this.startupParam = startupParam; + public void setUser(String user) { + this.user = user; + } + + public String getDbShare() { + return dbShare; + } + + public void setDbShare(String dbShare) { + this.dbShare = dbShare; + } + + public String getTableShare() { + return tableShare; + } + + public void setTableShare(String tableShare) { + this.tableShare = tableShare; + } + + public String getFilterShare() { + return filterShare; + } + + public void setFilterShare(String filterShare) { + this.filterShare = filterShare; + } + + public String getColumnShare() { + return columnShare; + } + + public void setColumnShare(String columnShare) { + this.columnShare = columnShare; + } + + public List> getConnectShare() { + return connectShare; + } + + public void setConnectShare(List> connectShare) { + this.connectShare = connectShare; } @Override public String toString() { return "DataQualityTask{" + - "applicationId='" + applicationId + '\'' + - ", taskType=" + taskType + + "taskType=" + taskType + + ", applicationId='" + applicationId + '\'' + ", ruleTaskDetails=" + ruleTaskDetails + + ", startupParam='" + startupParam + '\'' + ", createTime='" + createTime + '\'' + ", partition='" + partition + '\'' + ", taskId=" + taskId + ", user='" + user + '\'' + + ", dbShare='" + dbShare + '\'' + + ", tableShare='" + tableShare + '\'' + + ", filterShare='" + filterShare + '\'' + + ", columnShare='" + columnShare + '\'' + '}'; } } diff --git a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/AbstractTaskDivider.java b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/AbstractTaskDivider.java index 6973e4da..2cad4b38 
100644 --- a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/AbstractTaskDivider.java +++ b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/AbstractTaskDivider.java @@ -24,6 +24,7 @@ import java.util.Date; import java.util.List; +import java.util.Map; /** * @author howeye @@ -36,14 +37,19 @@ public abstract class AbstractTaskDivider { * @param createTime * @param partition * @param date - * @param database + * @param databaseMap + * @param dataSourceMysqlConnect * @param user * @param threshold + * @param splitBy + * @param startupParam * @return * @throws ArgumentException * @throws UnExpectedRequestException * @throws MetaDataAcquireFailedException */ public abstract List divide(List rules, String applicationId, String createTime, String partition, Date date, - String database, String user, Integer threshold) throws ArgumentException, UnExpectedRequestException, MetaDataAcquireFailedException; + Map> databaseMap, + Map>> dataSourceMysqlConnect, String user, + Integer threshold, String splitBy, String startupParam) throws ArgumentException, UnExpectedRequestException, MetaDataAcquireFailedException; } \ No newline at end of file diff --git a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/SameDataSourceTaskDivider.java b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/SameDataSourceTaskDivider.java index 81fee12e..b9590a12 100644 --- a/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/SameDataSourceTaskDivider.java +++ b/core/divider/src/main/java/com/webank/wedatasphere/qualitis/divider/SameDataSourceTaskDivider.java @@ -18,139 +18,211 @@ import com.webank.wedatasphere.qualitis.bean.DataQualityTask; import com.webank.wedatasphere.qualitis.bean.RuleTaskDetail; -import com.webank.wedatasphere.qualitis.dao.UserDao; import com.webank.wedatasphere.qualitis.exception.ArgumentException; +import com.webank.wedatasphere.qualitis.rule.dao.ExecutionParametersDao; import 
com.webank.wedatasphere.qualitis.rule.entity.Rule; import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSource; +import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSourceEnv; +import com.webank.wedatasphere.qualitis.scheduled.constant.RuleTypeEnum; +import com.webank.wedatasphere.qualitis.util.SpringContextHolder; import java.util.ArrayList; import java.util.Date; import java.util.HashMap; import java.util.List; import java.util.Map; -import java.util.UUID; import java.util.stream.Collectors; import org.apache.commons.collections.CollectionUtils; import org.apache.commons.lang.StringUtils; -import org.apache.commons.lang3.time.FastDateFormat; import org.slf4j.Logger; import org.slf4j.LoggerFactory; -import org.springframework.beans.factory.annotation.Autowired; /** * Divided rule into same task if they have the same datasource * @author howeye */ public class SameDataSourceTaskDivider extends AbstractTaskDivider { - @Autowired - private UserDao userDao; - - private static final FastDateFormat TASK_TIME_FORMAT = FastDateFormat.getInstance("yyyyMMddHHmmss"); private static final Logger LOGGER = LoggerFactory.getLogger(SameDataSourceTaskDivider.class); + private static final String SPLIT_BY_TABLE = "table"; + private static final String SPLIT_BY_NONE = "merge"; + + @Override - public List divide(List rules, String applicationId, String createTime, String partition, Date date, String database, String user - , Integer threshold) - throws ArgumentException { + public List divide(List rules, String applicationId, String createTime, String partition, Date date, Map> ruleReplaceInfo + , Map>> dataSourceMysqlConnect, String user, Integer threshold, String splitBy, String startupParam) throws ArgumentException { LOGGER.info("Start to classify rules by datasource"); Map> sameDataSourceRule = new HashMap<>(4); + Map keyUsers = new HashMap<>(2); + StringBuilder columns = new StringBuilder(); for (Rule rule : rules) { - String key = getKey(rule, user); - // Rules 
without specific execution parameters can be split into the same task, and rules with execution parameters must be treated as a separate task. - Boolean specifyStaticStartupParam = (rule.getSpecifyStaticStartupParam() != null && rule.getSpecifyStaticStartupParam()); - if (sameDataSourceRule.containsKey(key) && ! specifyStaticStartupParam) { + StringBuilder realUser = new StringBuilder(); + if (StringUtils.isEmpty(splitBy) && StringUtils.isNotEmpty(rule.getExecutionParametersName())) { + String concurrencyGranularity = SpringContextHolder.getBean(ExecutionParametersDao.class).findByNameAndProjectId(rule.getExecutionParametersName(), rule.getProject().getId()).getConcurrencyGranularity(); + if (StringUtils.isNotEmpty(concurrencyGranularity) && concurrencyGranularity.contains(":")) { + splitBy = concurrencyGranularity.split(":")[1]; + } + } + String key = getKey(rule, user, realUser, partition, splitBy, columns); + + if (ruleReplaceInfo.get(rule.getId()).get("qualitis_startup_param") != null && StringUtils.isNotEmpty((String) ruleReplaceInfo.get(rule.getId()).get("qualitis_startup_param"))) { + key = key + ":" + ruleReplaceInfo.get(rule.getId()).get("qualitis_startup_param"); + } + + if (sameDataSourceRule.containsKey(key)) { sameDataSourceRule.get(key).add(rule); - } else if (specifyStaticStartupParam) { - List tmp = new ArrayList<>(); - tmp.add(rule); - sameDataSourceRule.put(UUID.randomUUID().toString().replace("-", "") + "." + key, tmp); } else { List tmp = new ArrayList<>(); tmp.add(rule); sameDataSourceRule.put(key, tmp); + keyUsers.put(key, realUser.toString()); } } - LOGGER.info("Succeed to classify rules by datasource. Result: {}", sameDataSourceRule); + LOGGER.info("Succeed to classify rules by datasource, which may contain static params. 
Result: {}", sameDataSourceRule.keySet().stream().collect(Collectors.joining(","))); + List result = new ArrayList<>(); + handleSameDataSourceRule(applicationId, createTime, user, keyUsers, partition, ruleReplaceInfo, dataSourceMysqlConnect, threshold, sameDataSourceRule, result, startupParam, columns.toString()); + LOGGER.info("Succeed to divide all rules into tasks. Result: {}", result); + return result; + } + + private void handleSameDataSourceRule(String applicationId, String createTime, String user + , Map keyUsers, String partition, Map> ruleReplaceInfo + , Map>> dataSourceMysqlConnect, Integer threshold, Map> sameDataSourceRule, List result, String startupParam, String columns) throws ArgumentException { + for (String key : sameDataSourceRule.keySet()) { List ruleList = sameDataSourceRule.get(key); - String ruleStartup = ruleList.stream().map(Rule::getStaticStartupParam) - .filter(staticStartupParam -> StringUtils.isNotBlank(staticStartupParam)) - .collect(Collectors.joining()); + + Rule currentRule = ruleList.iterator().next(); + RuleDataSource currentRuleDataSource = currentRule.getRuleDataSources().iterator().next(); + + StringBuilder dynamicParam = new StringBuilder(); + + Map info = ruleReplaceInfo.get(currentRule.getId()); + + if (info != null && info.keySet().contains("qualitis_startup_param")) { + dynamicParam.append((String) info.get("qualitis_startup_param")); + } + List ruleIdList = ruleList.stream().map(Rule::getId).collect(Collectors.toList()); LOGGER.info("Start to divide rules: {} into a task.", ruleIdList); LOGGER.info("Start to divide rules. 
Key: {}", key); - String[] keys = key.split("\\."); - String proxyUser = keys[keys.length - 1]; - + String proxyUser = keyUsers.get(key); + LOGGER.info("Divide rules executed by {}", proxyUser); List ruleTaskDetails = new ArrayList<>(); - if (StringUtils.isNotBlank(proxyUser) && database.contains("_ind")) { - database = proxyUser.concat("_ind"); - } + for (Rule rule : ruleList) { String tableName = generateTable(rule); + String database = (String) ruleReplaceInfo.get(rule.getId()).get("qualitis_abnormal_database"); + + if (database.equals(user.concat("_ind")) && StringUtils.isNotBlank(proxyUser) && database.contains("_ind")) { + database = proxyUser.concat("_ind"); + } + String midTableName = database + "." + tableName; + LOGGER.info("Rule detail list size is: {}", ruleTaskDetails.size()); if (ruleTaskDetails.size() < threshold) { - ruleTaskDetails.add(new RuleTaskDetail(rule, midTableName)); + LOGGER.info("Adding rules in rule detail list"); } else { List ruleTaskDetailCopy = new ArrayList<>(); ruleTaskDetailCopy.addAll(ruleTaskDetails); DataQualityTask tmp = new DataQualityTask(applicationId, createTime, partition, ruleTaskDetailCopy); - if (StringUtils.isNotBlank(ruleStartup)) { - tmp.setStartupParam(ruleStartup); - } - if (StringUtils.isNotBlank(proxyUser)) { - LOGGER.info("Start to divide rules. 
Proxy user: {}", proxyUser); - tmp.setUser(proxyUser); - } + checkAndSaveStartupParamAndShareData(tmp, dynamicParam, startupParam, proxyUser, key, partition, currentRuleDataSource, dataSourceMysqlConnect, columns); + result.add(tmp); ruleTaskDetails = new ArrayList<>(); + LOGGER.info("Create new rule detail list"); } + ruleTaskDetails.add(new RuleTaskDetail(rule, midTableName)); } if (ruleTaskDetails.size() > 0) { DataQualityTask tmp = new DataQualityTask(applicationId, createTime, partition, ruleTaskDetails); - if (StringUtils.isNotBlank(ruleStartup)) { - tmp.setStartupParam(ruleStartup); - } - if (StringUtils.isNotBlank(proxyUser)) { - tmp.setUser(proxyUser); - } + checkAndSaveStartupParamAndShareData(tmp, dynamicParam, startupParam, proxyUser, key, partition, currentRuleDataSource, dataSourceMysqlConnect, columns); + result.add(tmp); LOGGER.info("Succeed to divide rules: {} into a task {}", ruleIdList, tmp); } } - LOGGER.info("Succeed to divide all rules into tasks. result: {}", result); - return result; + } + + private void checkAndSaveStartupParamAndShareData(DataQualityTask tmp, StringBuilder dynamicParam, String startupParam, String proxyUser + , String key, String partition, RuleDataSource ruleDataSource, Map>> dataSourceMysqlConnect, String columns) { + // Adjust priority: parameters submitted from the page form take precedence over the dynamic engine configuration in the execution parameters template + if (StringUtils.isNotBlank(dynamicParam.toString())) { + tmp.setStartupParam(dynamicParam.toString()); + } + + if (StringUtils.isNotBlank(startupParam)) { + tmp.setStartupParam(startupParam); + } + + if (StringUtils.isNotBlank(proxyUser)) { + LOGGER.info("Start to divide rules. Proxy user: {}", proxyUser); + tmp.setUser(proxyUser); + } + if (StringUtils.isEmpty(partition)) { + partition = ruleDataSource.getFilter(); + } + String partOfKey = ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." + ruleDataSource.getTableName() + "." 
+ partition; + if (StringUtils.isEmpty(ruleDataSource.getFileId()) && StringUtils.isEmpty(ruleDataSource.getFileHashValue()) && key.contains(partOfKey)) { + LOGGER.info("A merge data quality task, should set share data"); + tmp.setColumnShare(columns); + tmp.setFilterShare(partition); + tmp.setDbShare(ruleDataSource.getDbName()); + tmp.setTableShare(ruleDataSource.getTableName()); + if (dataSourceMysqlConnect != null && dataSourceMysqlConnect.get(ruleDataSource.getId()) != null) { + tmp.setConnectShare(dataSourceMysqlConnect.get(ruleDataSource.getId())); + } + } } private String generateTable(Rule rule) { - StringBuffer name = new StringBuffer(); + StringBuilder name = new StringBuilder(); name.append(rule.getProject().getName()).append("_") .append(rule.getName()); return name.toString(); } - private String getKey(Rule rule, String user) throws ArgumentException { - if (rule.getRuleDataSources().size() != 0) { - List ruleDataSourceList = rule.getRuleDataSources().stream().filter(dataSource -> StringUtils.isNotBlank(dataSource.getDbName())).collect( - Collectors.toList()); - RuleDataSource ruleDataSource; - if (CollectionUtils.isNotEmpty(ruleDataSourceList)) { - ruleDataSource = ruleDataSourceList.iterator().next(); - } else { - ruleDataSource = rule.getRuleDataSources().iterator().next(); - } + private String getKey(Rule rule, String user, StringBuilder realUser, String partition, String splitBy, StringBuilder columns) { + List ruleDataSourceList = rule.getRuleDataSources().stream().filter(dataSource -> StringUtils.isNotBlank(dataSource.getDbName()) && StringUtils.isNotBlank(dataSource.getTableName())).collect(Collectors.toList()); + + if (CollectionUtils.isNotEmpty(ruleDataSourceList)) { + RuleDataSource ruleDataSource = ruleDataSourceList.iterator().next(); String proxyUser = ruleDataSource.getProxyUser(); if (StringUtils.isNotBlank(proxyUser)) { - return ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." 
+ proxyUser; + user = proxyUser; } - return ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." + user; + realUser.append(user); + if (RuleTypeEnum.MULTI_TEMPLATE_RULE.getCode().equals(rule.getRuleType()) || RuleTypeEnum.CUSTOM_RULE.getCode().equals(rule.getRuleType())) { + String dataSourceKey = ruleDataSource.getClusterName() + "." + user; + return rule.getId() + "." + dataSourceKey; + } else { + if (StringUtils.isEmpty(partition)) { + partition = ruleDataSource.getFilter(); + } + if (StringUtils.isNotEmpty(ruleDataSource.getColName())) { + columns.append(ruleDataSource.getColName()).append("|"); + } + String envNames = "."; + List ruleDataSourceEnvs = ruleDataSource.getRuleDataSourceEnvs(); + if (CollectionUtils.isNotEmpty(ruleDataSourceEnvs)) { + envNames = envNames + ruleDataSourceEnvs.stream().map(ruleDataSourceEnv -> ruleDataSourceEnv.getEnvName()).collect(Collectors.joining("_")) + envNames; + } + if (SPLIT_BY_NONE.equals(splitBy)) { + return ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." + ruleDataSource.getTableName() + "." + partition + envNames + proxyUser; + } + if (SPLIT_BY_TABLE.equals(splitBy)) { + return ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." + ruleDataSource.getTableName() + "." + proxyUser; + } + return ruleDataSource.getClusterName() + "." + ruleDataSource.getDbName() + "." + proxyUser; + } + } else { + realUser.append(user); + return rule.getId().toString(); } - - throw new ArgumentException("Error! 
Rule variables miss data"); } } diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/RequestLinkis.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/RequestLinkis.java new file mode 100644 index 00000000..b90eaa5d --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/RequestLinkis.java @@ -0,0 +1,276 @@ +package com.webank.wedatasphere.qualitis.client; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.qualitis.client.request.AskLinkisParameter; +import com.webank.wedatasphere.qualitis.config.LinkisConfig; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import org.apache.commons.lang.StringUtils; +import org.apache.logging.log4j.util.Strings; +import org.codehaus.jackson.map.ObjectMapper; +import org.json.JSONArray; +import org.json.JSONObject; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.beans.factory.annotation.Qualifier; +import org.springframework.http.HttpEntity; +import org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.http.MediaType; +import org.springframework.retry.RetryContext; +import org.springframework.retry.backoff.FixedBackOffPolicy; +import org.springframework.retry.policy.SimpleRetryPolicy; +import org.springframework.retry.support.RetryTemplate; +import org.springframework.stereotype.Component; +import org.springframework.web.client.ResourceAccessException; +import org.springframework.web.client.RestTemplate; + +import java.io.IOException; +import java.util.Map; +import java.util.UUID; + +/** + * @author v_gaojiedeng + */ +@Component +public class RequestLinkis { + + private static final Logger LOGGER = LoggerFactory.getLogger(RequestLinkis.class); + 
+ private static final String STATUS = "status"; + + @Autowired + private LinkisConfig linkisConfig; + + @Autowired + @Qualifier("linkisRestTemplate") + private RestTemplate linkisRestTemplate; + + public Map getLinkisResponseByGet(AskLinkisParameter askLinkisParameter) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntity(askLinkisParameter); + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.GET, entity); + Map response; + try { + response = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.GET, entity, Map.class).getBody(); + } catch (ResourceAccessException e) { + return extractExceptionMessage(askLinkisParameter, e); + } + return finishLog(askLinkisParameter, response); + } + + public Map removeLinkisResponseByDelete(AskLinkisParameter askLinkisParameter) throws MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntity(askLinkisParameter); + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.DELETE, entity); + Map response; + try { + response = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.DELETE, entity, Map.class).getBody(); + } catch (ResourceAccessException e) { + return extractExceptionMessage(askLinkisParameter, e); + } + return finishLog(askLinkisParameter, response); + } + + public Map getLinkisResponseByGetRetry(AskLinkisParameter askLinkisParameter) throws Exception { + HttpEntity entity = createHttpEntity(askLinkisParameter); + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.GET, entity); + SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy(); + retryPolicy.setMaxAttempts(linkisConfig.getRestTemplateMaxAttempt()); + + FixedBackOffPolicy backOffPolicy 
= new FixedBackOffPolicy(); + backOffPolicy.setBackOffPeriod(linkisConfig.getRetryTimeInterval()); + + RetryTemplate template = new RetryTemplate(); + template.setRetryPolicy(retryPolicy); + template.setBackOffPolicy(backOffPolicy); + + Map response = template.execute(context -> { + Map result = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.GET, entity, Map.class).getBody(); + LOGGER.info("Third-party interface retry attempt " + context.getRetryCount() + " result: {}", new JSONObject(result)); + if (null == result.get(STATUS) || ((Integer) result.get(STATUS)).intValue() != 0) { + throw new Exception("Failed to call the interface, status code: " + result.get("status")); + } + return result; + }, context -> { + LOGGER.info("Third-party interface retry fallback executed after " + context.getRetryCount() + " attempts"); + String message = extractMessage(context); + Map map = Maps.newHashMapWithExpectedSize(1); + map.put("message", message); + return map; + }); + + return finishLog(askLinkisParameter, response); + } + + private String extractMessage(RetryContext context) { + if (context == null || context.getLastThrowable() == null) { + return Strings.EMPTY; + } + if (context.getLastThrowable().getCause() != null) { + String messageJson = context.getLastThrowable().getCause().getMessage(); + if (StringUtils.isNotEmpty(messageJson)) { + try { + Map msgMap = objectMapper.readValue(messageJson, Map.class); + if (msgMap.containsKey("message")) { + return msgMap.get("message"); + } + } catch (IOException e) { + LOGGER.error(e.getMessage(), e); + } + } + } + return context.getLastThrowable().getMessage() == null ? 
Strings.EMPTY : context.getLastThrowable().getMessage(); + } + + public Map getLinkisResponseByPost(AskLinkisParameter askLinkisParameter) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntity(askLinkisParameter); + return getStringObjectMap(askLinkisParameter, entity); + } + + public Map getLinkisResponseByPostBringJson(AskLinkisParameter askLinkisParameter, JSONObject jsonObject) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntityBringJson(askLinkisParameter, jsonObject); + return getStringObjectMap(askLinkisParameter, entity); + } + + public Map getLinkisResponseByPostBringJsonArray(AskLinkisParameter askLinkisParameter, JSONArray jsonArray) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntityBringJsonArray(askLinkisParameter, jsonArray); + return getStringObjectMap(askLinkisParameter, entity); + } + + + public Map getLinkisResponseByPostBringJsonRetry(AskLinkisParameter askLinkisParameter, JSONObject jsonObject) throws Exception { + HttpEntity entity = createHttpEntityBringJson(askLinkisParameter, jsonObject); + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.POST, entity); + SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy(); + retryPolicy.setMaxAttempts(linkisConfig.getRestTemplateMaxAttempt()); + + FixedBackOffPolicy backOffPolicy = new FixedBackOffPolicy(); + backOffPolicy.setBackOffPeriod(linkisConfig.getRetryTimeInterval()); + + RetryTemplate template = new RetryTemplate(); + template.setRetryPolicy(retryPolicy); + template.setBackOffPolicy(backOffPolicy); + + Map response = template.execute(context -> { + Map result = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.POST, entity, Map.class).getBody(); + LOGGER.info("Third-party interface retry attempt " + context.getRetryCount() + " result: {}", new 
JSONObject(result)); + if (null == result.get(STATUS) || ((Integer) result.get(STATUS)).intValue() != 0) { + throw new Exception("Failed to call the interface, status code: " + result.get("status")); + } + return result; + }, context -> { + LOGGER.info("Third-party interface retry fallback executed after " + context.getRetryCount() + " attempts"); + String message = extractMessage(context); + Map map = Maps.newHashMapWithExpectedSize(1); + map.put("message", message); + return map; + }); + + return finishLog(askLinkisParameter, response); + } + + + private Map getStringObjectMap(AskLinkisParameter askLinkisParameter, HttpEntity entity) throws MetaDataAcquireFailedException { + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.POST, entity); + Map response = null; + try { + response = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.POST, entity, Map.class).getBody(); + } catch (ResourceAccessException e) { + return extractExceptionMessage(askLinkisParameter, e); + } + return finishLog(askLinkisParameter, response); + } + + private ObjectMapper objectMapper = new ObjectMapper(); + + private Map extractExceptionMessage(AskLinkisParameter askLinkisParameter, ResourceAccessException e) throws MetaDataAcquireFailedException { + Map response = null; +// From LinkisErrorHandler + String originalMessage = e.getCause().getMessage(); + if (StringUtils.isNotEmpty(originalMessage)) { + try { + response = objectMapper.readValue(originalMessage, Map.class); + } catch (IOException ex) { + LOGGER.error(ex.getMessage(), ex); + } + if (null != response) { + return finishLog(askLinkisParameter, response); + } + } + LOGGER.error("Error! {}", e.getMessage(), e); + throw new MetaDataAcquireFailedException("Error! 
Can not " + askLinkisParameter.getLogmessage() + ", exception: " + e.getMessage(), 500); + } + + public Map getLinkisResponseByPut(AskLinkisParameter askLinkisParameter) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntity(askLinkisParameter); + return getMapForPutBringJson(askLinkisParameter, entity); + } + + private Map getMapForPutBringJson(AskLinkisParameter askLinkisParameter, HttpEntity entity) throws MetaDataAcquireFailedException { + LOGGER.info("Start to " + askLinkisParameter.getLogmessage() + " url: {}, method: {}, body: {}", askLinkisParameter.getUrl(), javax.ws.rs.HttpMethod.PUT, entity); + Map response = null; + try { + response = linkisRestTemplate.exchange(askLinkisParameter.getUrl(), HttpMethod.PUT, entity, Map.class).getBody(); + } catch (ResourceAccessException e) { + return extractExceptionMessage(askLinkisParameter, e); + } + return finishLog(askLinkisParameter, response); + } + + public Map getLinkisResponseByPutBringJson(AskLinkisParameter askLinkisParameter, JSONObject jsonObject) throws UnExpectedRequestException, MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntityBringJson(askLinkisParameter,jsonObject); + return getMapForPutBringJson(askLinkisParameter, entity); + } + + private boolean checkResponse(Map response) { + if (null == response.get(STATUS)) { + return false; + } + Integer responseStatus = (Integer) response.get(STATUS); + return responseStatus == 0; + } + + private HttpEntity createHttpEntity(AskLinkisParameter askLinkisParameter) { + HttpHeaders headers = new HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + headers.add("Token-User", askLinkisParameter.getAuthUser()); + headers.add("Token-Code", askLinkisParameter.getLinkisToken()); + return new HttpEntity<>(headers); + } + + private HttpEntity createHttpEntityBringJson(AskLinkisParameter askLinkisParameter, JSONObject jsonObject) { + HttpHeaders headers = new HttpHeaders(); + 
headers.setContentType(MediaType.APPLICATION_JSON); + headers.add("Token-User", askLinkisParameter.getAuthUser()); + headers.add("Token-Code", askLinkisParameter.getLinkisToken()); + return new HttpEntity<>(jsonObject.toString(), headers); + } + + private HttpEntity createHttpEntityBringJsonArray(AskLinkisParameter askLinkisParameter, JSONArray jsonArray) { + HttpHeaders headers = new HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + headers.add("Token-User", askLinkisParameter.getAuthUser()); + headers.add("Token-Code", askLinkisParameter.getLinkisToken()); + return new HttpEntity<>(jsonArray.toString(), headers); + } + + private Map finishLog(AskLinkisParameter askLinkisParameter, Map response) throws MetaDataAcquireFailedException { + String traceId = StringUtils.replace(UUID.randomUUID().toString(), "-", ""); + LOGGER.info("traceId: {} Finished to {}, url: {}, authUser: {}, response: {}", traceId, askLinkisParameter.getLogmessage(), askLinkisParameter.getUrl(), askLinkisParameter.getAuthUser(), response); + if (!checkResponse(response)) { + String content = null; + if (response.containsKey("message")) { + content = response.get("message").toString(); + } + String errorMsg = String.format("Error! 
Can not get meta data from linkis, traceId: %s, authUser: %s, exception: %s", traceId, askLinkisParameter.getAuthUser(), content); + LOGGER.error(errorMsg); + throw new MetaDataAcquireFailedException(content); + } + return response; + } + + public Map getLinkisResponseByPutBringJsonArray(AskLinkisParameter askLinkisParameter, JSONArray jsonArray) throws MetaDataAcquireFailedException { + HttpEntity entity = createHttpEntityBringJsonArray(askLinkisParameter, jsonArray); + return getMapForPutBringJson(askLinkisParameter, entity); + } +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/DataMapConfig.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/DataMapConfig.java new file mode 100644 index 00000000..2f3a192e --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/DataMapConfig.java @@ -0,0 +1,203 @@ +package com.webank.wedatasphere.qualitis.client.config; + +import org.springframework.beans.factory.annotation.Value; +import org.springframework.context.annotation.Configuration; + +/** + * @author allenzhou@webank.com + * @date 2021/6/15 11:35 + */ +@Configuration +public class DataMapConfig { + @Value("${datamap.isolateEnvFlag}") + private String isolateEnvFlag; + @Value("${datamap.address}") + private String address; + @Value("${datamap.dbs_path}") + private String databasePath; + @Value("${datamap.tables_path}") + private String tablePath; + @Value("${datamap.columns_path}") + private String columnPath; + @Value("${datamap.standard_path}") + private String standardPath; + @Value("${datamap.query_all_path}") + private String queryAllPath; + @Value("${datamap.dataset_tag_relations_path}") + private String datasetTagRelationsPath; + @Value("${datamap.tags_path}") + private String tagsPath; + @Value("${datamap.app_id}") + private String appId; + @Value("${datamap.app_token}") + private String appToken; + @Value("${datamap.user_id}") + private String 
userId; + @Value("${datamap.random_hash_salt}") + private String randomHashSalt; + @Value("${datamap.data_standard_urn_path}") + private String dataStandardUrnPath; + @Value("${datamap.data_standard_code_path}") + private String dataStandardCodePath; + @Value("${datamap.data_standard_code_table}") + private String dataStandardCodeTable; + + @Value("${datamap.data_standard_category}") + private String dataStandardCategory; + @Value("${datamap.data_standard_big_category}") + private String dataStandardBigCategory; + @Value("${datamap.data_standard_small_category}") + private String dataStandardSmallCategory; + + public String getIsolateEnvFlag() { + return isolateEnvFlag; + } + + public void setIsolateEnvFlag(String isolateEnvFlag) { + this.isolateEnvFlag = isolateEnvFlag; + } + + public String getAddress() { + return address; + } + + public void setAddress(String address) { + this.address = address; + } + + public String getDatabasePath() { + return databasePath; + } + + public void setDatabasePath(String databasePath) { + this.databasePath = databasePath; + } + + public String getTablePath() { + return tablePath; + } + + public void setTablePath(String tablePath) { + this.tablePath = tablePath; + } + + public String getColumnPath() { + return columnPath; + } + + public void setColumnPath(String columnPath) { + this.columnPath = columnPath; + } + + public String getStandardPath() { + return standardPath; + } + + public void setStandardPath(String standardPath) { + this.standardPath = standardPath; + } + + public String getQueryAllPath() { + return queryAllPath; + } + + public void setQueryAllPath(String queryAllPath) { + this.queryAllPath = queryAllPath; + } + + public String getDatasetTagRelationsPath() { + return datasetTagRelationsPath; + } + + public void setDatasetTagRelationsPath(String datasetTagRelationsPath) { + this.datasetTagRelationsPath = datasetTagRelationsPath; + } + + public String getTagsPath() { + return tagsPath; + } + + public void 
setTagsPath(String tagsPath) { + this.tagsPath = tagsPath; + } + + public String getAppId() { + return appId; + } + + public void setAppId(String appId) { + this.appId = appId; + } + + public String getAppToken() { + return appToken; + } + + public void setAppToken(String appToken) { + this.appToken = appToken; + } + + public String getUserId() { + return userId; + } + + public void setUserId(String userId) { + this.userId = userId; + } + + public String getRandomHashSalt() { + return randomHashSalt; + } + + public void setRandomHashSalt(String randomHashSalt) { + this.randomHashSalt = randomHashSalt; + } + + public String getDataStandardUrnPath() { + return dataStandardUrnPath; + } + + public void setDataStandardUrnPath(String dataStandardUrnPath) { + this.dataStandardUrnPath = dataStandardUrnPath; + } + + public String getDataStandardCodePath() { + return dataStandardCodePath; + } + + public void setDataStandardCodePath(String dataStandardCodePath) { + this.dataStandardCodePath = dataStandardCodePath; + } + + public String getDataStandardCategory() { + return dataStandardCategory; + } + + public void setDataStandardCategory(String dataStandardCategory) { + this.dataStandardCategory = dataStandardCategory; + } + + public String getDataStandardBigCategory() { + return dataStandardBigCategory; + } + + public void setDataStandardBigCategory(String dataStandardBigCategory) { + this.dataStandardBigCategory = dataStandardBigCategory; + } + + public String getDataStandardSmallCategory() { + return dataStandardSmallCategory; + } + + public void setDataStandardSmallCategory(String dataStandardSmallCategory) { + this.dataStandardSmallCategory = dataStandardSmallCategory; + } + + public String getDataStandardCodeTable() { + return dataStandardCodeTable; + } + + public void setDataStandardCodeTable(String dataStandardCodeTable) { + this.dataStandardCodeTable = dataStandardCodeTable; + } +} diff --git 
a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/OptimizationConfig.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/MetricPropertiesConfig.java similarity index 68% rename from core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/OptimizationConfig.java rename to core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/MetricPropertiesConfig.java index ca30c58b..1326c3df 100644 --- a/core/converter/src/main/java/com/webank/wedatasphere/qualitis/config/OptimizationConfig.java +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/MetricPropertiesConfig.java @@ -14,7 +14,7 @@ * limitations under the License. */ -package com.webank.wedatasphere.qualitis.config; +package com.webank.wedatasphere.qualitis.client.config; import org.springframework.beans.factory.annotation.Value; import org.springframework.context.annotation.Configuration; @@ -23,15 +23,16 @@ * @author allenzhou */ @Configuration -public class OptimizationConfig { - @Value("${linkis.lightweight_query}") - private Boolean lightweightQuery; +public class MetricPropertiesConfig { - public Boolean getLightweightQuery() { - return lightweightQuery; + @Value("${department.white_list}") + private String whiteList; + + public String getWhiteList() { + return whiteList; } - public void setLightweightQuery(Boolean lightweightQuery) { - this.lightweightQuery = lightweightQuery; + public void setWhiteList(String whiteList) { + this.whiteList = whiteList; } } diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/OperateCiConfig.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/OperateCiConfig.java new file mode 100644 index 00000000..1a7ef718 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/config/OperateCiConfig.java @@ -0,0 +1,125 @@ +package com.webank.wedatasphere.qualitis.client.config; + 
+import org.springframework.beans.factory.annotation.Value; +import org.springframework.context.annotation.Configuration; + +/** + * @author allenzhou@webank.com + * @date 2021/3/1 17:56 + */ +@Configuration +public class OperateCiConfig { + @Value("${cmdb.host}") + private String host; + + @Value("${cmdb.url}") + private String url; + + @Value("${cmdb.integrateUrl}") + private String integrateUrl; + + @Value("${cmdb.userAuthKey}") + private String userAuthKey; + + @Value("${cmdb.newUserAuthKey}") + private String newUserAuthKey; + + @Value("${cmdb.onlySlave}") + private Boolean onlySlave; + + @Value("${ef.host}") + private String efHost; + + @Value("${ef.url}") + private String efUrl; + + @Value("${ef.app_id}") + private String efAppId; + + @Value("${ef.app_token}") + private String efAppToken; + + public OperateCiConfig() { + // Do nothing. + } + + public String getUserAuthKey() { + return userAuthKey; + } + + public void setUserAuthKey(String userAuthKey) { + this.userAuthKey = userAuthKey; + } + + public String getHost() { + return host; + } + + public void setHost(String host) { + this.host = host; + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getIntegrateUrl() { + return integrateUrl; + } + + public void setIntegrateUrl(String integrateUrl) { + this.integrateUrl = integrateUrl; + } + + public String getNewUserAuthKey() { + return newUserAuthKey; + } + + public void setNewUserAuthKey(String newUserAuthKey) { + this.newUserAuthKey = newUserAuthKey; + } + + public Boolean getOnlySlave() { + return onlySlave; + } + + public void setOnlySlave(Boolean onlySlave) { + this.onlySlave = onlySlave; + } + + public String getEfHost() { + return efHost; + } + + public void setEfHost(String efHost) { + this.efHost = efHost; + } + + public String getEfUrl() { + return efUrl; + } + + public void setEfUrl(String efUrl) { + this.efUrl = efUrl; + } + + public String getEfAppId() { + return efAppId; 
+ } + + public void setEfAppId(String efAppId) { + this.efAppId = efAppId; + } + + public String getEfAppToken() { + return efAppToken; + } + + public void setEfAppToken(String efAppToken) { + this.efAppToken = efAppToken; + } +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/constant/OperateEnum.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/constant/OperateEnum.java new file mode 100644 index 00000000..959e91bf --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/constant/OperateEnum.java @@ -0,0 +1,34 @@ +package com.webank.wedatasphere.qualitis.client.constant; + +/** + * @author allenzhou@webank.com + * @date 2021/3/3 10:41 + */ +public enum OperateEnum { + /** + * Message types + */ + SUB_SYSTEM(1, "Subsystem info request"), + PRODUCT(2, "Product info request"), + DEPARTMENT(3, "Department info request"), + DEV_DEPARTMENT(4, "Dev department info request"), + OPS_DEPARTMENT(5, "Ops department info request"), + SUB_SYSTEM_FIND_DCN(6, "Subsystem DCN lookup") + ; + + private int code; + private String message; + + OperateEnum(int code, String message) { + this.code = code; + this.message = message; + } + + public int getCode() { + return code; + } + + public String getMessage() { + return message; + } +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/DataStandardClientImpl.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/DataStandardClientImpl.java new file mode 100644 index 00000000..8717182e --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/DataStandardClientImpl.java @@ -0,0 +1,376 @@ +package com.webank.wedatasphere.qualitis.client.impl; + +import com.google.gson.Gson; +import com.webank.wedatasphere.qualitis.client.config.DataMapConfig; +import com.webank.wedatasphere.qualitis.dao.ClusterInfoDao; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.client.DataStandardClient;
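Aside: the `DataStandardClientImpl` introduced in this diff signs every DataMap request with a double SHA-256 digest (see `constructUrlWithSignature` and `hashWithDataMap` further down in the patch). The following is a minimal, standalone sketch of that scheme for reference only; the appid/token values are placeholders, not real credentials:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Standalone sketch of the DataMap request signature used in this patch:
// signature = sha256Hex(sha256Hex(appId + nonce + loginUser + timestamp) + appToken)
// All credential values below are placeholders.
public class DataMapSignatureSketch {

    // Lower-case hex SHA-256, mirroring hashWithDataMap in DataStandardClientImpl.
    static String sha256Hex(String input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest(input.getBytes(StandardCharsets.UTF_8))) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    static String sign(String appId, String appToken, String nonce, String loginUser, String timestamp) throws Exception {
        // Inner hash covers the request identity; outer hash binds the shared app token.
        return sha256Hex(sha256Hex(appId + nonce + loginUser + timestamp) + appToken);
    }

    public static void main(String[] args) throws Exception {
        // Produces 64 lower-case hex characters, sent as the "signature" query parameter.
        System.out.println(sign("demoAppId", "demoToken", "12345", "alice", "1620000000000"));
    }
}
```

The production code also appends `appid`, `nonce`, `timestamp` and `loginUser` as query parameters, so the server can recompute and compare the same digest.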
+import com.webank.wedatasphere.qualitis.metadata.constant.DataMapResponseKeyEnum; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.util.UuidGenerator; +import org.apache.commons.lang.StringUtils; +import org.apache.http.HttpStatus; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.http.HttpEntity; +import org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.stereotype.Component; +import org.springframework.web.client.RestTemplate; + +import javax.ws.rs.core.UriBuilder; +import javax.xml.bind.DatatypeConverter; +import java.io.UnsupportedEncodingException; +import java.net.URI; +import java.net.URISyntaxException; +import java.net.URLDecoder; +import java.security.MessageDigest; +import java.security.NoSuchAlgorithmException; +import java.util.Map; + +/** + * @author allenzhou@webank.com + * @date 2021/6/15 15:29 + */ +@Component +public class DataStandardClientImpl implements DataStandardClient { + @Autowired + private RestTemplate restTemplate; + @Autowired + private DataMapConfig dataMapConfig; + @Autowired + private ClusterInfoDao clusterInfoDao; + + private static final Logger LOGGER = LoggerFactory.getLogger(DataStandardClientImpl.class); + + @Override + public Map getDatabase(String searchKey, String loginUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDatabasePath()) + .queryParam("searchKey", searchKey) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start 
to get db by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get db by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("msg"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, message: " + message, 403); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataset(String dbId, String datasetName, int page, int size, String loginUser) + throws UnExpectedRequestException, MetaDataAcquireFailedException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getTablePath()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("dbId", dbId).queryParam("datasetName", datasetName) + .queryParam("pageNum", page).queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get table by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get table by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("msg"); + throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getColumnStandard(Long datasetId, String fieldName, String loginUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getColumnPath()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("datasetId", datasetId) + .queryParam("fieldName", fieldName) + .queryParam("pageNum", 0) + .queryParam("pageSize", 1) + .queryParam("comment", "") + .queryParam("contentName", "") + .queryParam("scLevel", ""); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get field by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get field by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataStandardDetail(String stdCode, String source, String loginUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getStandardPath()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdCode", stdCode) + .queryParam("source", source); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get standard by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get standard by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataStandardCategory(int page, int size, String loginUser, String stdSubName) throws MetaDataAcquireFailedException, UnExpectedRequestException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardCategory()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdSubName", StringUtils.isNotBlank(stdSubName) ? 
stdSubName : "") + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + try { + url = URLDecoder.decode(url, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("Decode get path url exception", 500); + } + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get data standard category by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get data standard category by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataStandardBigCategory(int page, int size, String loginUser, String stdSubName, String stdBigCategoryName) throws MetaDataAcquireFailedException, UnExpectedRequestException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardBigCategory()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdSubName", StringUtils.isNotBlank(stdSubName) ? stdSubName : "") + .queryParam("stdBigCategoryName", StringUtils.isNotBlank(stdBigCategoryName) ? 
stdBigCategoryName : "") + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + try { + url = URLDecoder.decode(url, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("Decode get path url exception", 500); + } + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get data standard big category by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get data standard big category by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataStandardSmallCategory(int page, int size, String loginUser, String stdSubName, String stdBigCategoryName, String smallCategoryName) throws MetaDataAcquireFailedException, UnExpectedRequestException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardSmallCategory()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdSubName", StringUtils.isNotBlank(stdSubName) ? stdSubName : "") + .queryParam("stdBigCategoryName", StringUtils.isNotBlank(stdBigCategoryName) ? stdBigCategoryName : "") + .queryParam("smallCategoryName", StringUtils.isNotBlank(smallCategoryName) ? 
smallCategoryName : "") + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + try { + url = URLDecoder.decode(url, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("Decode get path url exception", 500); + } + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get data standard small category by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get data standard small category by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getDataStandard(int page, int size, String loginUser, String stdSmallCategoryUrn, String stdCnName) throws MetaDataAcquireFailedException, UnExpectedRequestException, URISyntaxException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardUrnPath()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdSmallCategoryUrn", stdSmallCategoryUrn) + .queryParam("stdCnName", StringUtils.isNotBlank(stdCnName) ? 
stdCnName : "") + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + String replaceUrl = url.replace("\"", "%22"); + + try { + replaceUrl = URLDecoder.decode(replaceUrl, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("Decode get path url exception", 500); + } + + URI uri = new URI(replaceUrl); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get data standard by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(uri, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get data standard by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getStandardCode(int page, int size, String loginUser, String stdUrn) throws MetaDataAcquireFailedException, UnExpectedRequestException, URISyntaxException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardCodePath()) + .queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()) + .queryParam("stdUrn", stdUrn) + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + String replaceUrl = url.replace("\"", "%22"); + URI uri = new URI(replaceUrl); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get standard code by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(uri, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get standard code by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + @Override + public Map getStandardCodeTable(int page, int size, String loginUser, String stdCode) throws MetaDataAcquireFailedException, UnExpectedRequestException { + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDataStandardCodeTable()) + .queryParam("stdCode", StringUtils.isNotBlank(stdCode) ? 
stdCode : "") + .queryParam("pageNum", page) + .queryParam("pageSize", size); + String url = constructUrlWithSignature(uriBuilder, loginUser); + + try { + url = URLDecoder.decode(url, "UTF-8"); + } catch (UnsupportedEncodingException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("Decode get path url exception", 500); + } + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + + HttpEntity entity = new HttpEntity<>(headers); + LOGGER.info("Start to get standard code table by datamap. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + String responseStr = restTemplate.exchange(url, HttpMethod.GET, entity, String.class).getBody(); + LOGGER.info("Finish to get standard code table by datamap. response: {}", responseStr); + Map response = new Gson().fromJson(responseStr, Map.class); + if (HttpStatus.SC_OK != Integer.parseInt((String) response.get(DataMapResponseKeyEnum.CODE.getKey()))) { + String message = (String) response.get("message"); + throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from DataMap, exception: " + message, 200); + } + + return (Map) response.get(DataMapResponseKeyEnum.DATA.getKey()); + } + + private String constructUrlWithSignature(UriBuilder url, String loginUser) throws UnExpectedRequestException { + String nonce = UuidGenerator.generateRandom(5); + String timestamp = String.valueOf(System.currentTimeMillis()); + + String signature = hashWithDataMap(hashWithDataMap(dataMapConfig.getAppId() + nonce + loginUser + timestamp) + dataMapConfig.getAppToken()); + + return url.queryParam("appid", dataMapConfig.getAppId()).queryParam("nonce", nonce).queryParam("timestamp", timestamp) + .queryParam("loginUser", loginUser).queryParam("signature", signature).toString(); + } + + private String hashWithDataMap(String str) throws UnExpectedRequestException { + MessageDigest md; + try { + md = MessageDigest.getInstance("SHA-256"); + } catch (NoSuchAlgorithmException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("An error occurred when picking a hash algorithm to construct the DataMap HTTP request.", 500); + } + md.update(str.getBytes()); + byte[] digest = md.digest(); + return DatatypeConverter.printHexBinary(digest).toLowerCase(); + } + +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/LinkisMetaDataManagerImpl.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/LinkisMetaDataManagerImpl.java new file mode 100644 index 00000000..4aeaa711 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/LinkisMetaDataManagerImpl.java @@ -0,0 +1,286 @@ +package com.webank.wedatasphere.qualitis.client.impl; + +import com.google.common.collect.Maps; +import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; +import com.webank.wedatasphere.qualitis.constants.ResponseStatusConstants; +import
com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.client.LinkisMetaDataManager; +import com.webank.wedatasphere.qualitis.metadata.client.MetaDataClient; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.metadata.request.LinkisConnectParamsRequest; +import com.webank.wedatasphere.qualitis.metadata.request.LinkisDataSourceEnvRequest; +import com.webank.wedatasphere.qualitis.metadata.request.LinkisDataSourceRequest; +import com.webank.wedatasphere.qualitis.metadata.request.ModifyDataSourceParameterRequest; +import com.webank.wedatasphere.qualitis.metadata.response.datasource.LinkisDataSourceParamsResponse; +import com.webank.wedatasphere.qualitis.response.GeneralResponse; +import com.webank.wedatasphere.qualitis.util.CryptoUtils; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.collections4.MapUtils; +import org.apache.commons.lang.StringUtils; +import org.codehaus.jackson.map.ObjectMapper; +import org.json.JSONException; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.stereotype.Service; + +import java.io.IOException; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.Objects; +import java.util.function.Function; +import java.util.stream.Collectors; + +/** + * @author v_minminghe@webank.com + * @date 2023-05-12 10:59 + * @description Wraps calls to metaDataClient; mainly handles wrapping and converting request parameters + */ +@Service +public class LinkisMetaDataManagerImpl implements LinkisMetaDataManager { + + private static final Logger LOGGER = LoggerFactory.getLogger(LinkisMetaDataManagerImpl.class); + + @Autowired + private MetaDataClient metaDataClient; + + private final ObjectMapper objectMapper = new ObjectMapper(); + + private static final String CREATE_SYSTEM = "Qualitis"; + + @Override
+ public Long createDataSource(LinkisDataSourceRequest linkisDataSourceRequest, String cluster, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + String dataSourceJson = createDatasourceJson(linkisDataSourceRequest); + GeneralResponse> generalResponse; + try { + generalResponse = metaDataClient.createDataSource(cluster, authUser, dataSourceJson); + } catch (JSONException e) { + throw new UnExpectedRequestException("Failed to format request parameter of dataSource"); + } + Map dataMap = generalResponse.getData(); + if (Objects.nonNull(dataMap) && dataMap.containsKey("insertId")) { + return Long.valueOf((Integer) dataMap.get("insertId")); + } + throw new UnExpectedRequestException("Failed to create DataSource"); + } + + @Override + public Long modifyDataSource(LinkisDataSourceRequest linkisDataSourceRequest, String cluster, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + String dataSourceJson = createDatasourceJson(linkisDataSourceRequest); + GeneralResponse> generalResponse; + try { + generalResponse = metaDataClient.modifyDataSource(cluster, authUser, linkisDataSourceRequest.getLinkisDataSourceId(), dataSourceJson); + } catch (JSONException e) { + throw new UnExpectedRequestException("Failed to format request parameter of dataSource"); + } + Map dataMap = generalResponse.getData(); + if (Objects.nonNull(dataMap) && dataMap.containsKey("updateId")) { + return Long.valueOf((Integer) dataMap.get("updateId")); + } + throw new UnExpectedRequestException("Failed to modify DataSource"); + } + + @Override + public List createDataSourceEnv(Integer inputType, Integer verifyType, List linkisDataSourceEnvRequestList, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + String envJson = createDatasourceEnvJson(inputType, verifyType, linkisDataSourceEnvRequestList); + GeneralResponse> datasourceEnvResponse; + try { + LOGGER.info("createDataSourceEnv, 
request body: {}", envJson); + datasourceEnvResponse = metaDataClient.createDataSourceEnvBatch(clusterName, authUser, CREATE_SYSTEM, envJson); + } catch (JSONException e) { + throw new UnExpectedRequestException("Failed to format request parameter of dataSource env"); + } + if (ResponseStatusConstants.OK.equals(datasourceEnvResponse.getCode()) && Objects.nonNull(datasourceEnvResponse.getData())) { + Map dataMap = datasourceEnvResponse.getData(); + if (dataMap.containsKey("envs")) { + List> envMapList = (List>) dataMap.get("envs"); + Map> nameAndEnvMap = envMapList.stream() + .filter(envMap -> envMap.containsKey("id")) + .collect(Collectors.toMap(envMap -> String.valueOf(envMap.get("envName")), Function.identity(), (oldVal, newVal) -> oldVal)); + linkisDataSourceEnvRequestList.forEach(linkisDataSourceEnvRequest -> { + if (nameAndEnvMap.containsKey(linkisDataSourceEnvRequest.getEnvName())) { + Map envMap = nameAndEnvMap.get(linkisDataSourceEnvRequest.getEnvName()); + if (envMap.containsKey("id")) { + linkisDataSourceEnvRequest.setId(Long.valueOf(envMap.get("id").toString())); + } + } + }); + } + } + return linkisDataSourceEnvRequestList; + } + + @Override + public List modifyDataSourceEnv(Integer inputType, Integer verifyType, List linkisDataSourceEnvRequestList, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + String envJson = createDatasourceEnvJson(inputType, verifyType, linkisDataSourceEnvRequestList); + GeneralResponse> datasourceEnvResponse; + try { + LOGGER.info("modifyDataSourceEnv, request body: {}", envJson); + datasourceEnvResponse = metaDataClient.modifyDataSourceEnvBatch(clusterName, authUser, CREATE_SYSTEM, envJson); + } catch (JSONException e) { + throw new UnExpectedRequestException("Failed to format request parameter of dataSource env"); + } + if (ResponseStatusConstants.OK.equals(datasourceEnvResponse.getCode()) && Objects.nonNull(datasourceEnvResponse.getData())) { + Map dataMap = 
datasourceEnvResponse.getData(); + if (dataMap.containsKey("envs")) { + List> envMapList = (List>) dataMap.get("envs"); + Map> nameAndEnvMap = envMapList.stream() + .filter(envMap -> envMap.containsKey("id")) + .collect(Collectors.toMap(envMap -> MapUtils.getString(envMap, "envName"), Function.identity(), (oldVal, newVal) -> oldVal)); + linkisDataSourceEnvRequestList.forEach(linkisDataSourceEnvRequest -> { + if (nameAndEnvMap.containsKey(linkisDataSourceEnvRequest.getEnvName())) { + Map envMap = nameAndEnvMap.get(linkisDataSourceEnvRequest.getEnvName()); + if (envMap.containsKey("id")) { + linkisDataSourceEnvRequest.setId(MapUtils.getLong(envMap, "id")); + } + } + }); + } + } + return linkisDataSourceEnvRequestList; + } + + @Override + public LinkisDataSourceParamsResponse modifyDataSourceParams(ModifyDataSourceParameterRequest modifyDataSourceParameterRequest, String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + Map parameterMap = Maps.newHashMapWithExpectedSize(2); + parameterMap.put("connectParams", modifyDataSourceParameterRequest.getConnectParams()); + parameterMap.put("comment", modifyDataSourceParameterRequest.getComment()); + String parameterRequest; + try { + parameterRequest = objectMapper.writeValueAsString(parameterMap); + } catch (IOException e) { + throw new UnExpectedRequestException("Failed to format json parameter of dataSource"); + } + GeneralResponse> generalResponse; + try { + generalResponse = metaDataClient.modifyDataSourceParam(clusterName, authUser + , modifyDataSourceParameterRequest.getLinkisDataSourceId(), parameterRequest); + } catch (JSONException e) { + throw new UnExpectedRequestException("Failed to format json parameter of dataSource"); + } + if (Objects.nonNull(generalResponse.getData())) { + Object version = generalResponse.getData().get("version"); + Long versionId = Objects.nonNull(version) ? 
Long.valueOf(version.toString()) : null; + return new LinkisDataSourceParamsResponse(versionId); + } + throw new UnExpectedRequestException("Failed to modify parameter of dataSource"); + } + + @Override + public void deleteDataSource(Long linkisDataSourceId, String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException { + GeneralResponse generalResponse = metaDataClient.deleteDataSource(clusterName, userName, linkisDataSourceId); + if (!ResponseStatusConstants.OK.equals(generalResponse.getCode())) { + throw new UnExpectedRequestException("Failed to delete DataSource to Linkis"); + } + } + + + private String createDatasourceEnvJson(Integer inputType, Integer verifyType, List dataSourceEnvList) throws UnExpectedRequestException { + if (CollectionUtils.isEmpty(dataSourceEnvList)) { + return StringUtils.EMPTY; + } + boolean isShared = isShared(verifyType); + boolean isAutoInput = isAutoInput(inputType); + for (LinkisDataSourceEnvRequest dataSourceEnv : dataSourceEnvList) { + if (Objects.isNull(dataSourceEnv.getConnectParamsRequest())) { + LOGGER.warn("Lack of connect parameter, envName: {}", dataSourceEnv.getEnvName()); + continue; + } + LinkisConnectParamsRequest connectParamsRequest = dataSourceEnv.getConnectParamsRequest(); + Map connectParamMap = new HashMap<>(); + if (isAutoInput) { + connectParamMap.put("database", dataSourceEnv.getDatabase()); + } + if (!isShared) { + String authType = connectParamsRequest.getAuthType(); + connectParamMap.put("authType", authType); + if (QualitisConstants.AUTH_TYPE_ACCOUNT_PWD.equals(authType)) { + connectParamMap.put("username", connectParamsRequest.getUsername()); + connectParamMap.put("password", CryptoUtils.encode(connectParamsRequest.getPassword())); + } else if (QualitisConstants.AUTH_TYPE_DPM.equals(authType)) { + connectParamMap.put("appid", connectParamsRequest.getAppId()); + connectParamMap.put("objectid", connectParamsRequest.getObjectId()); + connectParamMap.put("mkPrivate", 
connectParamsRequest.getMkPrivate()); + } + } + connectParamMap.put("host", connectParamsRequest.getHost()); + connectParamMap.put("port", connectParamsRequest.getPort()); + connectParamMap.put("params", connectParamsRequest.getConnectParam()); + dataSourceEnv.setConnectParams(connectParamMap); + } + try { + return objectMapper.writeValueAsString(dataSourceEnvList); + } catch (IOException e) { + throw new UnExpectedRequestException("Failed to format dataSource env json"); + } + } + + private String createDatasourceJson(LinkisDataSourceRequest linkisDataSourceRequest) throws UnExpectedRequestException { + if (MapUtils.isEmpty(linkisDataSourceRequest.getConnectParams())) { + Map connectParams = new HashMap<>(); + boolean isShared = isShared(linkisDataSourceRequest.getVerifyType()); + if (isShared && Objects.nonNull(linkisDataSourceRequest.getSharedConnectParams())) { + LinkisConnectParamsRequest connectParamsRequest = linkisDataSourceRequest.getSharedConnectParams(); + String authType = connectParamsRequest.getAuthType(); + connectParams.put("authType", authType); + if (QualitisConstants.AUTH_TYPE_ACCOUNT_PWD.equals(authType)) { + connectParams.put("username", connectParamsRequest.getUsername()); + connectParams.put("password", CryptoUtils.encode(connectParamsRequest.getPassword())); + } else if (QualitisConstants.AUTH_TYPE_DPM.equals(authType)) { + connectParams.put("appid", connectParamsRequest.getAppId()); + connectParams.put("objectid", connectParamsRequest.getObjectId()); + connectParams.put("mkPrivate", connectParamsRequest.getMkPrivate()); + } + } + connectParams.put("subSystem", linkisDataSourceRequest.getSubSystem()); + connectParams.put("share", isShared); + connectParams.put("dcn", isAutoInput(linkisDataSourceRequest.getInputType())); + connectParams.put("multi_env", true); + + linkisDataSourceRequest.setConnectParams(connectParams); + } + + validateConnectParams(linkisDataSourceRequest.getConnectParams()); + + try { + return 
objectMapper.writeValueAsString(linkisDataSourceRequest); + } catch (IOException e) { + throw new UnExpectedRequestException("Failed to format dataSource json"); + } + } + + private void validateConnectParams(Map<String, Object> connectParams) throws UnExpectedRequestException { + validateKey(connectParams, "subSystem"); + validateKey(connectParams, "share"); + validateKey(connectParams, "dcn"); + if (Boolean.TRUE.toString().equals(String.valueOf(connectParams.getOrDefault("share", Boolean.FALSE)))) { + validateKey(connectParams, "authType"); + String authType = (String) connectParams.get("authType"); + if (QualitisConstants.AUTH_TYPE_ACCOUNT_PWD.equals(authType)) { + validateKey(connectParams, "username"); + validateKey(connectParams, "password"); + } else if (QualitisConstants.AUTH_TYPE_DPM.equals(authType)) { + validateKey(connectParams, "appid"); + validateKey(connectParams, "objectid"); + validateKey(connectParams, "mkPrivate"); + } + } + } + + private void validateKey(Map<String, Object> map, String key) throws UnExpectedRequestException { + if (!map.containsKey(key)) { + throw new UnExpectedRequestException("Parameter must not be null: " + key); + } + } + + private boolean isAutoInput(Integer inputType) { + return Integer.valueOf(QualitisConstants.DATASOURCE_MANAGER_INPUT_TYPE_AUTO).equals(inputType); + } + + private boolean isShared(Integer verifyType) { + return Integer.valueOf(QualitisConstants.DATASOURCE_MANAGER_VERIFY_TYPE_SHARE).equals(verifyType); + } + +} \ No newline at end of file diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/MetaDataClientImpl.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/MetaDataClientImpl.java index fdc198b5..860091f9 100644 --- a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/MetaDataClientImpl.java +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/MetaDataClientImpl.java @@ -16,6 +16,11 @@ package 
com.webank.wedatasphere.qualitis.client.impl; +import com.google.common.cache.Cache; +import com.google.common.cache.CacheBuilder; +import com.google.common.collect.Maps; +import com.webank.wedatasphere.qualitis.client.RequestLinkis; +import com.webank.wedatasphere.qualitis.client.request.AskLinkisParameter; import com.webank.wedatasphere.qualitis.config.LinkisConfig; import com.webank.wedatasphere.qualitis.constant.LinkisResponseKeyEnum; import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; @@ -23,42 +28,54 @@ import com.webank.wedatasphere.qualitis.entity.ClusterInfo; import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.metadata.client.MetaDataClient; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; import com.webank.wedatasphere.qualitis.metadata.request.GetClusterByUserRequest; import com.webank.wedatasphere.qualitis.metadata.request.GetColumnByUserAndTableRequest; -import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; import com.webank.wedatasphere.qualitis.metadata.request.GetDbByUserAndClusterRequest; import com.webank.wedatasphere.qualitis.metadata.request.GetTableByUserAndDbRequest; -import com.webank.wedatasphere.qualitis.metadata.request.GetUserTableByCsIdRequest; import com.webank.wedatasphere.qualitis.metadata.request.GetUserColumnByCsRequest; +import com.webank.wedatasphere.qualitis.metadata.request.GetUserTableByCsIdRequest; import com.webank.wedatasphere.qualitis.metadata.response.DataInfo; import com.webank.wedatasphere.qualitis.metadata.response.cluster.ClusterInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.column.ColumnInfoDetail; +import com.webank.wedatasphere.qualitis.metadata.response.datasource.LinkisDataSourceInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.db.DbInfoDetail; import 
com.webank.wedatasphere.qualitis.metadata.response.table.CsTableInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.table.PartitionStatisticsInfo; import com.webank.wedatasphere.qualitis.metadata.response.table.TableInfoDetail; import com.webank.wedatasphere.qualitis.metadata.response.table.TableStatisticsInfo; import com.webank.wedatasphere.qualitis.response.GeneralResponse; +import java.io.File; +import java.io.IOException; import java.io.UnsupportedEncodingException; import java.net.URLDecoder; +import java.nio.charset.Charset; import java.util.ArrayList; +import java.util.HashMap; import java.util.List; import java.util.Map; +import java.util.Objects; +import java.util.concurrent.TimeUnit; import java.util.stream.Collectors; import javax.ws.rs.core.UriBuilder; import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.collections.MapUtils; import org.apache.commons.lang.StringUtils; +import org.apache.http.HttpStatus; +import org.apache.http.client.methods.CloseableHttpResponse; +import org.apache.http.client.methods.HttpPost; +import org.apache.http.entity.ContentType; +import org.apache.http.entity.mime.MultipartEntityBuilder; +import org.apache.http.impl.client.CloseableHttpClient; +import org.apache.http.impl.client.HttpClients; +import org.codehaus.jackson.map.ObjectMapper; +import org.json.JSONArray; import org.json.JSONException; import org.json.JSONObject; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; -import org.springframework.http.HttpEntity; -import org.springframework.http.HttpHeaders; -import org.springframework.http.HttpMethod; -import org.springframework.http.MediaType; import org.springframework.stereotype.Component; -import org.springframework.web.client.ResourceAccessException; import org.springframework.web.client.RestClientException; import org.springframework.web.client.RestTemplate; @@ -68,12 +85,12 @@ */ @Component public 
class MetaDataClientImpl implements MetaDataClient { - private final static String QUERY_CS_TABLE_PATH = "/dss/cs/tables"; - private final static String QUERY_CS_COLUMN_PATH = "/dss/cs/columns"; private final static String QUERY_WORKFLOW_TABLE_PATH = "/dss/workflow/tables"; private final static String QUERY_WORKFLOW_COLUMN_PATH = "/dss/workflow/columns"; - private static final String LINKIS_ONE_VERSION = "1.0"; + private static final String STATUS = "status"; + private static final String COLUMNS = "columns"; + private static final String INFO = "info"; @Autowired private ClusterInfoDao clusterInfoDao; @@ -84,54 +101,58 @@ public class MetaDataClientImpl implements MetaDataClient { @Autowired private LinkisConfig linkisConfig; + @Autowired + private RequestLinkis requestLinkis; + + private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper(); + private static final Logger LOGGER = LoggerFactory.getLogger(MetaDataClientImpl.class); + /** + * key: cluster name, value: cluster object + */ + private Cache clusterInfoCache = CacheBuilder.newBuilder() + .expireAfterAccess(5, TimeUnit.MINUTES) + .expireAfterWrite(5, TimeUnit.MINUTES) + .build(); + @Override public DataInfo getClusterByUser(GetClusterByUserRequest request) { Long total = clusterInfoDao.countAll(); List allCluster = clusterInfoDao.findAllClusterInfo(request.getStartIndex(), request.getPageSize()); - DataInfo dataInfo = new DataInfo<>(total.intValue()); + DataInfo dataInfo = new DataInfo<>(); if (CollectionUtils.isEmpty(allCluster)) { return dataInfo; } + // Remove datasource cluster + allCluster = allCluster.stream().filter(clusterInfo -> ! 
clusterInfo.getClusterName().equals(linkisConfig.getDatasourceCluster())).collect(Collectors.toList()); + total -= 1; + List details = new ArrayList<>(); for (ClusterInfo clusterInfo : allCluster) { ClusterInfoDetail detail = new ClusterInfoDetail(clusterInfo.getClusterName()); details.add(detail); } + dataInfo.setTotalCount(total.intValue()); dataInfo.setContent(details); return dataInfo; } @Override public DataInfo getDbByUserAndCluster(GetDbByUserAndClusterRequest request) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name - ClusterInfo clusterInfo = checkClusterNameExists( - request.getClusterName()); + ClusterInfo clusterInfo = checkClusterNameExists(request.getClusterName()); String authUser = request.getLoginUser(); + // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDbPath()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get db by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Start to get db by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, message: " + message); - } + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get db by user and cluster by linkis."); List allDbs = ((List>) ((Map) response.get("data")).get("dbs")).stream() - .map(o -> o.get("dbName")).collect(Collectors.toList()); + .map(o -> o.get("dbName")).collect(Collectors.toList()); DataInfo dataInfo = new DataInfo<>(allDbs.size()); if (CollectionUtils.isEmpty(allDbs)) { @@ -149,32 +170,18 @@ public DataInfo getDbByUserAndCluster(GetDbByUserAndClusterRequest @Override public DataInfo getTableByUserAndDb(GetTableByUserAndDbRequest request) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + throws UnExpectedRequestException, MetaDataAcquireFailedException { ClusterInfo clusterInfo = checkClusterNameExists(request.getClusterName()); String authUser = request.getLoginUser(); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getTablePath()) - .queryParam("database", request.getDbName()).toString(); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get table by user and cluster and db by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finished to get table by user and cluster and db by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } + .queryParam("database", request.getDbName()).toString(); + + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get table by user and cluster and db by linkis."); List allTables = ((List>) ((Map) response.get("data")).get("tables")).stream() - .map(o -> o.get("tableName")).collect(Collectors.toList()); + .map(o -> o.get("tableName")).collect(Collectors.toList()); DataInfo dataInfo = new DataInfo<>(allTables.size()); @@ -192,32 +199,21 @@ public DataInfo getTableByUserAndDb(GetTableByUserAndDbRequest @Override public DataInfo getColumnByUserAndTable(GetColumnByUserAndTableRequest request) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + throws UnExpectedRequestException, MetaDataAcquireFailedException { ClusterInfo clusterInfo = checkClusterNameExists(request.getClusterName()); String authUser = request.getLoginUser(); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getColumnPath()) - .queryParam("database", request.getDbName()).queryParam("table", request.getTableName()).toString(); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get column by user and cluster and db and table by linkis. url: {}, method: {}, body: {}", url, - javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finished to get table by user and cluster and and table by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } + .queryParam("database", request.getDbName()).queryParam("table", request.getTableName()).toString(); + + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get column by user and cluster and db and table by linkis."); - List> allTables = ((List>) ((Map) response.get("data")).get("columns")); + Map dataMap = (Map) response.get("data"); + if (!dataMap.containsKey(COLUMNS) || Objects.isNull(dataMap.get(COLUMNS))) { + return new DataInfo<>(); + } + List> allTables = ((List>) dataMap.get(COLUMNS)); DataInfo dataInfo = new DataInfo<>(allTables.size()); if (CollectionUtils.isEmpty(allTables)) { @@ -232,32 +228,58 @@ public DataInfo getColumnByUserAndTable(GetColumnByUserAndTabl return dataInfo; } + private Map gainResponseLinkisByGet(ClusterInfo clusterInfo, String authUser, String url, String logMessage) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByGet(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage)); + } + + private Map gainResponseLinkisByDelete(ClusterInfo clusterInfo, String authUser, String url, String logMessage) throws MetaDataAcquireFailedException { + return requestLinkis.removeLinkisResponseByDelete(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage)); + } + + private Map gainResponseLinkisByGetRetry(ClusterInfo clusterInfo, String authUser, String url, String logMessage) throws Exception { + return requestLinkis.getLinkisResponseByGetRetry(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage)); + } + + private Map gainResponseLinkisByPost(ClusterInfo clusterInfo, String authUser, String url, String logMessage) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByPost(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage)); + } + + private Map 
gainResponseLinkisByPostBringJson(ClusterInfo clusterInfo, String authUser, String url, String logMessage, JSONObject jsonObject) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByPostBringJson(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage), jsonObject); + } + + private Map gainResponseLinkisByPostBringJsonArray(ClusterInfo clusterInfo, String authUser, String url, String logMessage, JSONArray jsonArray) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByPostBringJsonArray(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage), jsonArray); + } + + private Map gainResponseLinkisByPostBringJsonRetry(ClusterInfo clusterInfo, String authUser, String url, String logMessage, JSONObject jsonObject) throws Exception { + return requestLinkis.getLinkisResponseByPostBringJsonRetry(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage), jsonObject); + } + + private Map gainResponseLinkisByPut(ClusterInfo clusterInfo, String authUser, String url, String logMessage) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByPut(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage)); + } + + private Map gainResponseLinkisByPutBringJson(ClusterInfo clusterInfo, String authUser, String url, String logMessage, JSONObject jsonObject) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return requestLinkis.getLinkisResponseByPutBringJson(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage),jsonObject); + } + + private Map gainResponseLinkisByPutBringJsonArray(ClusterInfo clusterInfo, String authUser, String url, String logMessage, JSONArray jsonArray) throws UnExpectedRequestException, MetaDataAcquireFailedException { + return 
requestLinkis.getLinkisResponseByPutBringJsonArray(new AskLinkisParameter(url, clusterInfo.getLinkisToken(), authUser, logMessage), jsonArray); + } + + @Override public String getTableBasicInfo(String clusterName, String dbName, String tableName, String userName) - throws MetaDataAcquireFailedException, UnExpectedRequestException { + throws MetaDataAcquireFailedException, UnExpectedRequestException { ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get table comment. String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getTableInfo()) - .queryParam("database", dbName).queryParam("tableName", tableName).toString(); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", userName); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get table comment by user and cluster and db by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, - entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finished to get table comment by user and cluster and db by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } + .queryParam("database", dbName).queryParam("tableName", tableName).toString(); + + Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "get table comment by user and cluster and db by linkis."); Object result = ((Map) ((Map) ((Map) response.get("data")).get("tableBaseInfo")).get("base")) - .get("comment"); + .get("comment"); String comment = result == null ? 
"no comment" : result.toString(); return comment; } @@ -265,7 +287,7 @@ public String getTableBasicInfo(String clusterName, String dbName, String tableN @Override public DataInfo getTableByCsId(GetUserTableByCsIdRequest request) - throws MetaDataAcquireFailedException, UnExpectedRequestException { + throws Exception { DataInfo result = new DataInfo<>(); List csTableInfoDetailList = new ArrayList<>(); try { @@ -274,17 +296,9 @@ public DataInfo getTableByCsId(GetUserTableByCsIdRequest requ String authUser = request.getLoginUser(); // send request - String url; - if (clusterInfo.getClusterType().endsWith(LINKIS_ONE_VERSION)) { - url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_WORKFLOW_TABLE_PATH).toString(); - } else { - url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_CS_TABLE_PATH).toString(); - } + String url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_WORKFLOW_TABLE_PATH).toString(); + - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); JSONObject jsonObject = new JSONObject(); try { jsonObject.put("contextID", request.getCsId()); @@ -293,24 +307,15 @@ public DataInfo getTableByCsId(GetUserTableByCsIdRequest requ LOGGER.error(e.getMessage(), e); throw new UnExpectedRequestException("Failed to construct http body json with context ID and node name", 500); } - - HttpEntity entity = new HttpEntity<>(jsonObject.toString(), headers); - LOGGER.info("Start to get table with context service ID and node name by restful API. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity); - Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody(); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } - LOGGER.info("Finished to get table with context service ID and node name by restful API. response: {}", response); + // Retry + Map response = gainResponseLinkisByPostBringJsonRetry(clusterInfo, authUser, url, "get table with context service ID and node name by restful API.",jsonObject); Map data = (Map) response.get("data"); List> tables = (List>) data.get("tables"); if (tables == null || tables.size() == 0) { return result; } LOGGER.info("Successfully to get tables with context service ID and node name by restful API. csId: {}, nodeName: {}, tables: {}", - request.getCsId(), request.getNodeName(), tables); + request.getCsId(), request.getNodeName(), tables); for (Map table : tables) { CsTableInfoDetail csTableInfoDetail = new CsTableInfoDetail(); csTableInfoDetail.setTableName(table.get("tableName").toString()); @@ -327,26 +332,13 @@ public DataInfo getTableByCsId(GetUserTableByCsIdRequest requ } @Override - public List getColumnInfo(String clusterName, String dbName, String tableName, String userName) throws MetaDataAcquireFailedException, UnExpectedRequestException { + public List getColumnInfo(String clusterName, String dbName, String tableName, String userName) throws Exception { ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get table comment. String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getColumnInfo()) - .queryParam("database", dbName).queryParam("tableName", tableName).toString(); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", userName); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get column info by user and cluster and db and table by linkis. 
url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finished to get column info by user and cluster and db and table by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } + .queryParam("database", dbName).queryParam("tableName", tableName).toString(); + // Retry + Map response = gainResponseLinkisByGetRetry(clusterInfo, userName, url, "get column info by user and cluster and db and table by linkis."); List> tableFieldInfo = (List>) ((Map) response.get("data")).get("tableFieldsInfo"); List result = new ArrayList<>(); @@ -354,7 +346,7 @@ public List getColumnInfo(String clusterName, String dbName, S ColumnInfoDetail columnInfoDetail = new ColumnInfoDetail(); columnInfoDetail.setFieldName(map.get("name").toString()); columnInfoDetail.setDataType(map.get("type").toString()); - if (map.get("length") != null && ! "".equals(map.get("length").toString())) { + if (map.get("length") != null && !"".equals(map.get("length").toString())) { columnInfoDetail.setColumnLen(Integer.parseInt(map.get("length").toString())); } columnInfoDetail.setColumnAlias(map.get("alias") == null ? "" : map.get("alias").toString()); @@ -368,26 +360,18 @@ public List getColumnInfo(String clusterName, String dbName, S @Override public DataInfo getColumnByCsId(GetUserColumnByCsRequest request) - throws MetaDataAcquireFailedException, UnExpectedRequestException { + throws Exception { DataInfo result = new DataInfo<>(); List list = new ArrayList<>(); try { LOGGER.info("Start to get columns with context service ID and table's context key. 
csId: {}, contextKey: {}", request.getCsId(), - request.getContextKey()); + request.getContextKey()); ClusterInfo clusterInfo = checkClusterNameExists(request.getClusterName()); String authUser = request.getLoginUser(); // send request - String url; - if (clusterInfo.getClusterType().endsWith(LINKIS_ONE_VERSION)) { - url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_WORKFLOW_COLUMN_PATH).toString(); - } else { - url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_CS_COLUMN_PATH).toString(); - } - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + String url = getPath(clusterInfo.getLinkisAddress()).path(QUERY_WORKFLOW_COLUMN_PATH).toString(); + JSONObject jsonObject = new JSONObject(); try { jsonObject.put("contextID", request.getCsId()); @@ -395,25 +379,15 @@ public DataInfo getColumnByCsId(GetUserColumnByCsRequest reque } catch (JSONException e) { LOGGER.error("Failed to construct http body json, exception is : {}", e); } + Map response = gainResponseLinkisByPostBringJsonRetry(clusterInfo, authUser, url, "get column with context service ID and table's context key by restful API.", jsonObject); - HttpEntity entity = new HttpEntity<>(jsonObject.toString(), headers); - LOGGER.info("Start to get column with context service ID and table's context key by restful API. url: {}, method: {}, body: {}", url, - javax.ws.rs.HttpMethod.POST, entity); - Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody(); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - LOGGER.info("Finished to get column with context service ID and table's context key by restful API. 
response: {}", response); Map data = (Map) response.get("data"); List> columns = (List>) data.get("columns"); if (columns == null || columns.size() == 0) { return result; } LOGGER.info("Successfully to get columns with context service ID and table's context key by restful API. csId: {}, contextKey: {}", - request.getCsId(), request.getContextKey()); + request.getCsId(), request.getContextKey()); for (Map column : columns) { ColumnInfoDetail columnInfoDetail = new ColumnInfoDetail(); columnInfoDetail.setFieldName(column.get("columnName").toString()); @@ -433,54 +407,34 @@ public DataInfo getColumnByCsId(GetUserColumnByCsRequest reque @Override public TableStatisticsInfo getTableStatisticsInfo(String clusterName, String dbName, String tableName, String userName) - throws UnExpectedRequestException, MetaDataAcquireFailedException, RestClientException { + throws Exception { ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getTableStatistics()) - .queryParam("database", dbName) - .queryParam("tableName", tableName).toString(); + .queryParam("database", dbName) + .queryParam("tableName", tableName).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", userName); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get table info by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response; - try { - response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - } catch (ResourceAccessException e) { - LOGGER.error(e.getMessage(), e); - throw new MetaDataAcquireFailedException("Error! Can not get table info from linkis, exception: " + e.getMessage(), 500); - } - LOGGER.info("Finish to get table info by linkis. 
response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get table info from linkis, exception: " + message); - } + Map response = gainResponseLinkisByGetRetry(clusterInfo, userName, url, "get table info by linkis"); Map result = (Map) ((Map) response.get("data")).get("tableStatisticInfo"); TableStatisticsInfo tableStatisticsInfo = new TableStatisticsInfo(); tableStatisticsInfo.setTableFileCount(Integer.parseInt(result.get("fileNum").toString())); tableStatisticsInfo.setTableSize(result.get("tableSize").toString()); - tableStatisticsInfo.setPartitions((List) result.get("partitions")); + tableStatisticsInfo.setPartitions((List>) result.get("partitions")); return tableStatisticsInfo; } @Override public PartitionStatisticsInfo getPartitionStatisticsInfo(String clusterName, String dbName, String tableName, String partitionPath, String userName) - throws UnExpectedRequestException, MetaDataAcquireFailedException, RestClientException { + throws Exception { ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getPartitionStatistics()) - .queryParam("database", dbName) - .queryParam("tableName", tableName) - .queryParam("partitionPath", partitionPath).toString(); + .queryParam("database", dbName) + .queryParam("tableName", tableName) + .queryParam("partitionPath", partitionPath).toString(); try { url = URLDecoder.decode(url, "UTF-8"); } catch (UnsupportedEncodingException e) { @@ -488,32 +442,13 @@ public PartitionStatisticsInfo getPartitionStatisticsInfo(String clusterName, St throw new UnExpectedRequestException("Decode get partition statistic info exception", 500); } - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - 
        headers.add("Token-User", userName);
-        headers.add("Token-Code", clusterInfo.getLinkisToken());
-
-        HttpEntity entity = new HttpEntity<>(headers);
-        LOGGER.info("Start to get partition info by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity);
-        Map response = null;
-        try {
-            response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody();
-        } catch (ResourceAccessException e) {
-            LOGGER.error(e.getMessage(), e);
-            throw new MetaDataAcquireFailedException("Error! Can not get partition info from linkis, exception: " + e.getMessage(), 500);
-        }
-        LOGGER.info("Finish to get partition info by linkis. response: {}", response);
-
-        if (! checkResponse(response)) {
-            String message = (String) response.get("message");
-            LOGGER.error("Error! Can not get meta data from linkis, message: " + message);
-            throw new MetaDataAcquireFailedException("Error! Can not get partition info from linkis, exception: " + message);
-        }
+        Map response = gainResponseLinkisByGetRetry(clusterInfo, userName, url, "get partition info by linkis.");
        Map result = (Map) ((Map) response.get("data")).get("partitionStatisticInfo");
        PartitionStatisticsInfo partitionStatisticsInfo = new PartitionStatisticsInfo();
        partitionStatisticsInfo.setPartitionChildCount(Integer.parseInt(result.get("fileNum").toString()));
+        partitionStatisticsInfo.setModificationTime((Long) result.get("modificationTime"));
        partitionStatisticsInfo.setPartitionSize(result.get("partitionSize").toString());
-        partitionStatisticsInfo.setPartitions((List) result.get("childrens"));
+        partitionStatisticsInfo.setPartitions((List>) result.get("childrens"));
        return partitionStatisticsInfo;
    }
@@ -525,8 +460,8 @@ public boolean fieldExist(String col, List cols, Map 0;
+        if (col.equals(SpecCharEnum.STAR.getValue())) {
+            return cols.size() > 0;
        }
        String[] colsInfo = col.split("\\|");
        int diff = colsInfo.length;
@@ -534,7 +469,7 @@ public boolean fieldExist(String col, List cols, Map cols, Map 0) {
        int diff = mappingCols.size();
-        for (String colName : mappingCols.keySet()) {
+        for (Map.Entry entry : mappingCols.entrySet()) {
+            String key = entry.getKey();
+            String value = entry.getValue();
            for (ColumnInfoDetail columnInfoDetail : cols) {
-                if (columnInfoDetail.getFieldName().equals(colName) && columnInfoDetail.getDataType().equals(mappingCols.get(colName))) {
-                    diff --;
+                if (columnInfoDetail.getFieldName().equals(key) && columnInfoDetail.getDataType().equals(value)) {
+                    diff--;
                    break;
                }
            }
@@ -560,68 +497,84 @@ public boolean fieldExist(String col, List cols, Map getAllDataSourceTypes(String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+    public GeneralResponse> getAllDataSourceTypes(String clusterName, String authUser) throws UnExpectedRequestException, MetaDataAcquireFailedException {
        // Check existence of cluster name
        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
        // send request to get dbs
        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceTypes()).toString();
-        HttpHeaders headers = new HttpHeaders();
-        headers.setContentType(MediaType.APPLICATION_JSON);
-        headers.add("Token-User", authUser);
-        headers.add("Token-Code", clusterInfo.getLinkisToken());
-
-        HttpEntity entity = new HttpEntity<>(headers);
-        LOGGER.info("Start to get data source types by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity);
-        Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody();
-        LOGGER.info("Finish to get data source types by user and cluster by linkis. response: {}", response);
-        if (! checkResponse(response)) {
-            String message = (String) response.get("message");
-            LOGGER.error("Error! Can not get meta data from linkis, message: " + message);
-            throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List types = (List) data.get("type_list"); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source types by user and cluster by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get all datasource types", data); } @Override - public GeneralResponse getDataSourceEnv(String clusterName, String authUser) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceEnv(String clusterName, String authUser) + throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceEnv()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source env by user and cluster by linkis."); - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source env by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source env by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List types = (List) data.get("query_list"); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource env", data); } @Override - public GeneralResponse getDataSourceInfoPage(String clusterName, String authUser, int page, int size, String searchName, - Long typeId) throws UnExpectedRequestException, MetaDataAcquireFailedException, UnsupportedEncodingException { + public GeneralResponse> createDataSourceEnvBatch(String clusterName, String authUser, String createSystem, String datasourceEnvs) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException { + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceEnvCreateBatch()) + .queryParam("system", "Qualitis") + .toString(); + Map response = gainResponseLinkisByPostBringJsonArray(clusterInfo, authUser, url, "batch create data source env param by user and cluster by linkis." 
+ , new JSONArray(datasourceEnvs)); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to create datasource env connect params", data); + } + + @Override + public GeneralResponse> modifyDataSourceEnvBatch(String clusterName, String authUser, String createSystem, String datasourceEnvs) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException { + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceEnvModifyBatch()) + .queryParam("system", "Qualitis") + .toString(); + + Map response = gainResponseLinkisByPutBringJsonArray(clusterInfo, authUser, url, "modify data source env by user and cluster by linkis." + ,new JSONArray(datasourceEnvs)); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to modify datasource", data); + } + + @Override + public GeneralResponse> getDatasourceEnvById(String clusterName, String authUser, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceEnvDetail()).toString() + .replace("{ENV_ID}", envId.toString()); + + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source env by user and cluster by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to get datasource version", data); + } + + @Override + public GeneralResponse> getDataSourceInfoPage(String clusterName, String authUser, int page, int size, String searchName, + Long typeId) throws UnExpectedRequestException, 
MetaDataAcquireFailedException, UnsupportedEncodingException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs UriBuilder uriBuilder = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceInfo()) - .queryParam("currentPage", page).queryParam("pageSize", size); + .queryParam("currentPage", page).queryParam("pageSize", size); if (StringUtils.isNotBlank(searchName)) { uriBuilder.queryParam("name", searchName); } @@ -632,55 +585,45 @@ public GeneralResponse getDataSourceInfoPage(String clusterName, String aut String url = uriBuilder.toString(); url = URLDecoder.decode(url, "UTF-8"); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source info by user and cluster by linkis."); - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source info by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source info by user and cluster by linkis. response: {}", response); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to get datasource info", data); + } - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); + @Override + public GeneralResponse> getDataSourceInfoByIds(String clusterName, String userName, List dataSourceIds) throws UnExpectedRequestException, MetaDataAcquireFailedException, IOException { + if (CollectionUtils.isEmpty(dataSourceIds)) { + return new GeneralResponse<>("200", "Success to get datasource info by ids", null); } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List types = (List) data.get("query_list"); - return new GeneralResponse<>("200", "Success to get datasource info", data); + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + UriBuilder uriBuilder = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceInfoIds()) + .queryParam("ids", new ObjectMapper().writeValueAsString(dataSourceIds)); + String url = uriBuilder.toString(); + url = URLDecoder.decode(url, "UTF-8"); + Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "get data source info by user and cluster by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to get datasource info by ids", data); } @Override - public GeneralResponse getDataSourceVersions(String clusterName, String authUser, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceVersions(String clusterName, String authUser, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceVersions()).toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString()); - HttpHeaders headers = new HttpHeaders(); - 
headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source versions by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source versions by user and cluster by linkis. response: {}", response); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source versions by user and cluster by linkis."); - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List types = (List) data.get("versions"); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource version", data); } @Override - public GeneralResponse getDataSourceInfoDetail(String clusterName, String authUser, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceInfoDetail(String clusterName, String authUser, Long dataSourceId, Long versionId) throws Exception { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs @@ -689,108 +632,55 @@ public GeneralResponse getDataSourceInfoDetail(String clusterName, String a uriBuilder.path(versionId.toString()); } String url = uriBuilder.toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", 
authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source info detail by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source info detail by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - Map types = (Map) data.get("info"); + Map response = gainResponseLinkisByGetRetry(clusterInfo, authUser, url, "get data source info detail by user and cluster by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource detail info", data); } @Override - public GeneralResponse getDataSourceInfoDetailByName(String clusterName, String authUser, String dataSourceName) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceInfoDetailByName(String clusterName, String authUser, String dataSourceName) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs UriBuilder uriBuilder = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceInfoName()).path(dataSourceName); String url = uriBuilder.toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", 
clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source info detail by user and cluster and name by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source info detail by user and cluster and name by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source info detail by user and cluster and name by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource info detail by datasource name", data); } @Override - public GeneralResponse getDataSourceKeyDefine(String clusterName, String authUser, Long keyId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceKeyDefine(String clusterName, String authUser, Long keyId) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceKeyDefine()).path(keyId.toString()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to 
get data source key define by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source key define by user and cluster by linkis. response: {}", response); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url, "get data source key define by user and cluster by linkis."); - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource key define", data); } @Override - public GeneralResponse connectDataSource(String clusterName, String authUser, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> connectDataSource(String clusterName, String authUser, String jsonRequest) throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceConnect()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByPostBringJson(clusterInfo, authUser, url, "connect data source by user and cluster by linkis.",new JSONObject(jsonRequest)); - HttpEntity entity = new HttpEntity<>(jsonRequest, headers); - 
LOGGER.info("Start to connect data source by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity); - Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody(); - LOGGER.info("Finish to connect data source by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "{&CONNECT_SUCCESS}", data); } @Override - public GeneralResponse getDataSourceConnectParams(String clusterName, String authUser, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> getDataSourceConnectParams(String clusterName, String authUser, Long dataSourceId, Long versionId) throws Exception { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs @@ -799,183 +689,105 @@ public GeneralResponse getDataSourceConnectParams(String clusterName, Strin uriBuilder.path(versionId.toString()); } String url = uriBuilder.toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString()); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get data source connect params by user and cluster by linkis. 
url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get data source connect params by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map response = gainResponseLinkisByGetRetry(clusterInfo, authUser, url, "get data source connect params by user and cluster by linkis."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to get datasource connect params", data); } @Override - public GeneralResponse publishDataSource(String clusterName, String authUser, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> publishDataSource(String clusterName, String authUser, Long dataSourceId, Long versionId) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourcePublish()).path(dataSourceId.toString()).path(versionId.toString()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByPost(clusterInfo, authUser, url, "publish data source by user and cluster by linkis."); - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to 
publish data source by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity);
-        Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody();
-        LOGGER.info("Finish to publish data source by user and cluster by linkis. response: {}", response);
-
-        if (! checkResponse(response)) {
-            String message = (String) response.get("message");
-            LOGGER.error("Error! Can not get meta data from linkis, message: " + message);
-            throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message);
-        }
-        Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey());
+        Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey());
        return new GeneralResponse<>("200", "Success to publish datasource", data);
    }

    @Override
-    public GeneralResponse expireDataSource(String clusterName, String authUser, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+    public GeneralResponse> expireDataSource(String clusterName, String authUser, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException {
        // Check existence of cluster name
        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
        // send request to get dbs
        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceExpire()).toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString());
-        HttpHeaders headers = new HttpHeaders();
-        headers.setContentType(MediaType.APPLICATION_JSON);
-        headers.add("Token-User", authUser);
-        headers.add("Token-Code", clusterInfo.getLinkisToken());
+        Map response = gainResponseLinkisByPut(clusterInfo, authUser, url, "expire data source by user and cluster by linkis.");
-        HttpEntity entity = new HttpEntity<>(headers);
-        LOGGER.info("Start to expire data source by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.PUT, entity);
-        Map response = restTemplate.exchange(url, HttpMethod.PUT, entity, Map.class).getBody();
-        LOGGER.info("Finish to expire data source by user and cluster by linkis. response: {}", response);
-
-        if (! checkResponse(response)) {
-            String message = (String) response.get("message");
-            LOGGER.error("Error! Can not get meta data from linkis, message: " + message);
-            throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message);
-        }
-        Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey());
+        Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey());
        return new GeneralResponse<>("200", "Success to expire datasource", data);
    }

    @Override
-    public GeneralResponse modifyDataSource(String clusterName, String authUser, Long dataSourceId, String jsonRequest)
-            throws UnExpectedRequestException, MetaDataAcquireFailedException {
+    public GeneralResponse> modifyDataSource(String clusterName, String authUser, Long dataSourceId, String jsonRequest)
+            throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException {
        // Check existence of cluster name
        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
        // send request to get dbs
        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceModify()).toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString());
-        HttpHeaders headers = new HttpHeaders();
-        headers.setContentType(MediaType.APPLICATION_JSON);
-        headers.add("Token-User", authUser);
-        headers.add("Token-Code", clusterInfo.getLinkisToken());
-
-        HttpEntity entity = new HttpEntity<>(jsonRequest, headers);
-        LOGGER.info("Start to modify data source by user and cluster by linkis. 
url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.PUT, entity); - Map response = restTemplate.exchange(url, HttpMethod.PUT, entity, Map.class).getBody(); - LOGGER.info("Finish to modify data source by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map response = gainResponseLinkisByPutBringJson(clusterInfo, authUser, url, "modify data source by user and cluster by linkis.",new JSONObject(jsonRequest)); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to modify datasource", data); } @Override - public GeneralResponse modifyDataSourceParam(String clusterName, String authUser, Long dataSourceId, String jsonRequest) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> modifyDataSourceParam(String clusterName, String authUser, Long dataSourceId, String jsonRequest) + throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceInitVersion()).toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString()); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByPostBringJson(clusterInfo, authUser, url, "modify data source param by user and cluster by linkis.",new JSONObject(jsonRequest)); - 
HttpEntity entity = new HttpEntity<>(jsonRequest, headers); - LOGGER.info("Start to modify data source param by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity); - Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody(); - LOGGER.info("Finish to modify data source param by user and cluster by linkis. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); return new GeneralResponse<>("200", "Success to modify datasource connect params", data); } @Override - public GeneralResponse createDataSource(String clusterName, String authUser, String jsonRequest) - throws UnExpectedRequestException, MetaDataAcquireFailedException { + public GeneralResponse> createDataSource(String clusterName, String authUser, String jsonRequest) + throws UnExpectedRequestException, MetaDataAcquireFailedException, JSONException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceCreate()).toString(); - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); + Map response = gainResponseLinkisByPostBringJson(clusterInfo, authUser, url, "create data source by user and cluster by linkis.", new JSONObject(jsonRequest)); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success 
to create datasource", data); + } - HttpEntity entity = new HttpEntity<>(jsonRequest, headers); - LOGGER.info("Start to create data source by user and cluster by linkis. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity); - Map response = restTemplate.exchange(url, HttpMethod.POST, entity, Map.class).getBody(); - LOGGER.info("Finish to create data source by user and cluster by linkis. response: {}", response); + @Override + public GeneralResponse> deleteDataSource(String clusterName, String userName, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceDelete()).toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString()); + Map response = gainResponseLinkisByDelete(clusterInfo, userName, url, "delete data source by user and cluster by linkis."); - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); - } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - return new GeneralResponse<>("200", "Success to create datasource", data); + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + return new GeneralResponse<>("200", "Success to delete datasource", data); } @Override - public Map getDbsByDataSource(String clusterName, String authUser, Long dataSourceId) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public Map getDbsByDataSourceName(String clusterName, String authUser, String dataSourceName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs - String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceDb()).queryParam("system", "Qualitis").toString().replace("{DATA_SOURCE_ID}", dataSourceId.toString()); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get dbs by data source. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get dbs by data source. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! 
Can not get meta data from linkis, exception: " + message); + UriBuilder url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceQueryDb()).queryParam("system", "Qualitis") + .queryParam("dataSourceName", dataSourceName); + if (Objects.nonNull(envId)) { + url.queryParam("envId", envId); } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List dbs = (List) data.get("dbs"); + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url.toString(), "get dbs by data source."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + List> dbs = (List>) data.get("dbs"); if (CollectionUtils.isEmpty(dbs)) { LOGGER.info("No dbs with data source to be choosed."); } @@ -983,31 +795,22 @@ public Map getDbsByDataSource(String clusterName, String authUser, Long dataSour } @Override - public Map getTablesByDataSource(String clusterName, String authUser, Long dataSourceId, String dbName) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public Map getTablesByDataSourceName(String clusterName, String authUser, String dataSourceName, String dbName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs - String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceTable()).queryParam("system", "Qualitis").toString() - .replace("{DATA_SOURCE_ID}", dataSourceId.toString()) - .replace("{DATA_SOURCE_DB}", dbName); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get tables by data source. 
url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get tables by data source. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); + UriBuilder url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceQueryTable()) + .queryParam("system", "Qualitis") + .queryParam("dataSourceName", dataSourceName) + .queryParam("database", dbName); + if (Objects.nonNull(envId)) { + url.queryParam("envId", envId); } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); - List tables = (List) data.get("tables"); + + Map response = gainResponseLinkisByGet(clusterInfo, authUser, url.toString(), "get tables by data source."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + List> tables = (List>) data.get("tables"); if (CollectionUtils.isEmpty(tables)) { LOGGER.info("No tables with data source to be choosed."); } @@ -1015,38 +818,60 @@ public Map getTablesByDataSource(String clusterName, String authUser, Long dataS } @Override - public DataInfo getColumnsByDataSource(String clusterName, String authUser, Long dataSourceId, String dbName, String tableName) throws UnExpectedRequestException, MetaDataAcquireFailedException { + public DataInfo getColumnsByDataSource(String clusterName, String authUser, Long dataSourceId, String dbName, String tableName) throws Exception { // Check existence of cluster name ClusterInfo clusterInfo = checkClusterNameExists(clusterName); // send request to get dbs String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceColumn()).queryParam("system", "Qualitis").toString() - 
.replace("{DATA_SOURCE_ID}", dataSourceId.toString()) - .replace("{DATA_SOURCE_DB}", dbName) - .replace("{DATA_SOURCE_TABLE}", tableName); - - HttpHeaders headers = new HttpHeaders(); - headers.setContentType(MediaType.APPLICATION_JSON); - headers.add("Token-User", authUser); - headers.add("Token-Code", clusterInfo.getLinkisToken()); - - HttpEntity entity = new HttpEntity<>(headers); - LOGGER.info("Start to get columns by data source. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.GET, entity); - Map response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); - LOGGER.info("Finish to get columns by data source. response: {}", response); - - if (! checkResponse(response)) { - String message = (String) response.get("message"); - LOGGER.error("Error! Can not get meta data from linkis, message: " + message); - throw new MetaDataAcquireFailedException("Error! Can not get meta data from linkis, exception: " + message); + .replace("{DATA_SOURCE_ID}", dataSourceId.toString()) + .replace("{DATA_SOURCE_DB}", dbName) + .replace("{DATA_SOURCE_TABLE}", tableName); + + Map response = gainResponseLinkisByGetRetry(clusterInfo, authUser, url, "get columns by data source."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + DataInfo result = new DataInfo<>(); + List> tables = (List>) data.get("columns"); + if (CollectionUtils.isEmpty(tables)) { + LOGGER.info("No columns with data source to be choosed."); + } else { + List columnInfoDetailList = new ArrayList<>(tables.size()); + for (Map map : tables) { + ColumnInfoDetail columnInfoDetail = new ColumnInfoDetail(); + columnInfoDetail.setFieldName((String) map.get("name")); + columnInfoDetail.setDataType((String) map.get("type")); + columnInfoDetailList.add(columnInfoDetail); + } + result.setTotalCount(columnInfoDetailList.size()); + result.setContent(columnInfoDetailList); + } + return result; + } + + @Override + public DataInfo getColumnsByDataSourceName(String clusterName, 
String authUser, String dataSourceName, String dbName, String tableName, Long envId) throws Exception { + // Check existence of cluster name + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + // send request to get dbs + UriBuilder url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDatasourceQueryColumn()) + .queryParam("system", "Qualitis") + .queryParam("dataSourceName", dataSourceName) + .queryParam("database", dbName) + .queryParam("table", tableName); + if (Objects.nonNull(envId)) { + url.queryParam("envId", envId); } - Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); + + Map response = gainResponseLinkisByGetRetry(clusterInfo, authUser, url.toString(), "get columns by data source."); + + Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey()); DataInfo result = new DataInfo<>(); - List tables = (List) data.get("columns"); + List> tables = (List>) data.get("columns"); if (CollectionUtils.isEmpty(tables)) { LOGGER.info("No columns with data source to be choosed."); } else { List columnInfoDetailList = new ArrayList<>(tables.size()); - for (Map map : tables) { + for (Map map : tables) { ColumnInfoDetail columnInfoDetail = new ColumnInfoDetail(); columnInfoDetail.setFieldName((String) map.get("name")); columnInfoDetail.setDataType((String) map.get("type")); @@ -1058,20 +883,402 @@ public DataInfo getColumnsByDataSource(String clusterName, Str return result; } + @Override + public int getUndoneTaskTotal(String clusterName, String executionUser) throws UnExpectedRequestException, MetaDataAcquireFailedException { + ClusterInfo clusterInfo = checkClusterNameExists(clusterName); + long curSysTime = System.currentTimeMillis(); + + long deadtime = curSysTime - linkisConfig.getUnDoneDays()*24*60*60*1000L; + + String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUnDone()) + .queryParam("engineType", linkisConfig.getEngineName()) + .queryParam("creator", linkisConfig.getAppName()) + 
.queryParam("startDate", deadtime)
+            .queryParam("endDate", curSysTime)
+            .toString();
+
+        Map response = gainResponseLinkisByGet(clusterInfo, executionUser, url, "get undone task total.");
+
+        Integer undoneTaskTotal = ((Integer) ((Map) response.get("data")).get("totalPage"));
+
+        if (undoneTaskTotal != null) {
+            LOGGER.info("Current undone task num: " + undoneTaskTotal.intValue());
+            return undoneTaskTotal.intValue();
+        }
+        return 0;
+    }
+
+    @Override
+    public LinkisDataSourceInfoDetail getDataSourceInfoById(String clusterName, String userName, Long dataSourceId) throws Exception {
+        GeneralResponse<Map<String, Object>> generalResponse = getDataSourceInfoDetail(clusterName, userName, dataSourceId, null);
+        if (MapUtils.isEmpty(generalResponse.getData()) || ! generalResponse.getData().containsKey(INFO)) {
+            throw new MetaDataAcquireFailedException("Failed to acquire data source by id");
+        }
+        Map infoMap = (Map) generalResponse.getData().get(INFO);
+        ObjectMapper objectMapper = new ObjectMapper();
+        String infoJson = objectMapper.writeValueAsString(infoMap);
+        return objectMapper.readValue(infoJson, LinkisDataSourceInfoDetail.class);
+    }
+
+    @Override
+    public Long addUdf(String currentCluster, String userName, Map requestBody) throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(currentCluster);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfAdd()).toString();
+        Map response = gainResponseLinkisByPostBringJson(clusterInfo, userName, url, "add udf with linkis api", new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+
+        if (checkResponse(response)) {
+            // Get real ID
+            return ((Integer) ((Map) response.get("data")).get("udfId")).longValue();
+        }
+        return null;
+    }
+
+    @Override
+    public void modifyUdf(String currentCluster, String userName, Map requestBody) throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(currentCluster);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfModify()).toString();
+        Map response = gainResponseLinkisByPostBringJson(clusterInfo, userName, url, "modify udf with linkis api", new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to modify udf.");
+            return;
+        }
+        LOGGER.error("Failed to modify udf.");
+    }
+
+    @Override
+    public String checkFilePathExistsAndUploadToWorkspace(String currentCluster, String userName, File uploadFile, Boolean needUpload) throws UnExpectedRequestException, MetaDataAcquireFailedException, IOException, JSONException {
+        ClusterInfo clusterInfo = checkClusterNameExists(currentCluster);
+        String targetFilePath = new StringBuffer(linkisConfig.getUploadWorkspacePrefix())
+            .append(File.separator).append(userName).append(File.separator).append("qualitis").toString();
+        String getPathUrl = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUploadDir()).queryParam("path", targetFilePath).toString();
+        try {
+            getPathUrl = URLDecoder.decode(getPathUrl, "UTF-8");
+        } catch (UnsupportedEncodingException e) {
+            LOGGER.error(e.getMessage(), e);
+            throw new UnExpectedRequestException("Decode get path url exception", 500);
+        }
+        Map responseMap = gainResponseLinkisByGet(clusterInfo, userName, getPathUrl, "Check file path.");
+
+        Map dirFileTrees = (Map) ((Map) responseMap.get("data")).get("dirFileTrees");
+        if (dirFileTrees == null) {
+            Map requestBody = Maps.newHashMapWithExpectedSize(1);
+            requestBody.put("path", targetFilePath);
+            String createPathUrl = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUploadCreateDir()).toString();
+
+            Map createResponse = gainResponseLinkisByPostBringJson(clusterInfo, userName, createPathUrl, "Create file path.", new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+            if (checkResponse(createResponse)) {
+                LOGGER.info("Succeed to create file path.");
+            } else {
+                throw new UnExpectedRequestException("Failed to create file path.");
+            }
+        }
+
+        if (! needUpload) {
+            return targetFilePath;
+        }
+
+        // Upload
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUpload()).toString();
+        HttpPost httppost = new HttpPost(url);
+        CloseableHttpClient httpclient = HttpClients.createDefault();
+        MultipartEntityBuilder multipartEntityBuilder = MultipartEntityBuilder.create();
+        multipartEntityBuilder.setContentType(ContentType.MULTIPART_FORM_DATA);
+        multipartEntityBuilder.setCharset(Charset.forName("UTF-8"));
+        multipartEntityBuilder.addBinaryBody("file", uploadFile);
+        multipartEntityBuilder.addTextBody("path", "file://" + targetFilePath);
+
+        httppost.addHeader("Token-User", userName);
+        httppost.addHeader("Token-Code", clusterInfo.getLinkisToken());
+
+        httppost.setEntity(multipartEntityBuilder.build());
+        CloseableHttpResponse response = null;
+        try {
+            response = httpclient.execute(httppost);
+            int code = response.getStatusLine().getStatusCode();
+            if (code != HttpStatus.SC_OK) {
+                throw new UnExpectedRequestException("{&FAILED_TO_CALL_UPLOAD_API}");
+            }
+        } catch (IOException e) {
+            throw new UnExpectedRequestException("{&FAILED_TO_CALL_UPLOAD_API}");
+        } finally {
+            if (response != null) {
+                try {
+                    response.close();
+                } catch (IOException e) {
+                    LOGGER.error(e.getMessage(), e);
+                }
+            }
+
+            try {
+                httpclient.close();
+            } catch (IOException e) {
+                LOGGER.error(e.getMessage(), e);
+            }
+        }
+
+        return targetFilePath;
+    }
+
+    @Override
+    public Long clientAdd(String currentCluster, String targetFilePath, File uploadFile, String fileName, String udfDesc, String udfName, String returnType
+        , String enter, String registerName, Boolean status, String dir) throws MetaDataAcquireFailedException, UnExpectedRequestException, JSONException, IOException {
+        boolean commonJar = fileName.endsWith(".jar");
+        boolean pythonScript = fileName.endsWith(".py");
+
+        Map requestBody = new HashMap<>(1);
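The `clientAdd`/`clientModify` code around this point builds the UDF `useFormat` and `registerFormat` strings with the same three-way dispatch on the file extension (jar, Spark Python, Spark Scala). A standalone sketch of that dispatch, pulled out as a pure function — the class and method names here are illustrative, not part of Qualitis:

```java
// Illustrative sketch of the registerFormat dispatch used in
// clientAdd/clientModify: udfType 0 = jar (Hive-style UDF),
// 1 = Spark Python, 2 = Spark Scala.
public class UdfRegisterFormat {
    public static String registerFormat(String fileName, String udfName,
                                        String registerName, String returnType, String enter) {
        if (fileName.endsWith(".jar")) {
            // Hive-style temporary function registration
            return "create temporary function " + udfName + " as \"" + registerName + "\"";
        } else if (fileName.endsWith(".py")) {
            // Spark Python registration
            return "udf.register(\"" + udfName + "\", " + registerName + ")";
        } else {
            // Spark Scala registration
            return "sqlContext.udf.register[" + returnType + "," + enter + "](\"" + udfName + "\"," + registerName + ")";
        }
    }
}
```

Factoring the string construction out like this would also remove the duplication between `clientAdd` and `clientModify`, which currently carry two copies of the same branches.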
+ Map requestInBody = new HashMap<>(10); + requestInBody.put("path", targetFilePath + File.separator + uploadFile.getName()); + requestInBody.put("description", udfDesc); + requestInBody.put("udfName", udfName); + + // Support script type. 1 for spark python, 2 for spark scala. + if (commonJar) { + requestInBody.put("udfType", 0); + requestInBody.put("useFormat", returnType + " " + udfName + "(" + enter + ")"); + requestInBody.put("registerFormat", "create temporary function " + udfName + " as \"" + registerName + "\""); + } else if (pythonScript) { + requestInBody.put("udfType", 1); + requestInBody.put("useFormat", returnType + " " + udfName + "(" + enter + ")"); + requestInBody.put("registerFormat", "udf.register(\"" + udfName + "\", " + registerName + ")"); + } else { + requestInBody.put("udfType", 2); + requestInBody.put("useFormat", udfName + "()"); + requestInBody.put("registerFormat", "sqlContext.udf.register[" + returnType + "," + enter + "](\"" + udfName + "\"," + registerName + ")"); + } + + requestInBody.put("clusterName", "All"); + requestInBody.put("load", status); + requestInBody.put("directory", dir); + requestInBody.put("sys", linkisConfig.getAppName()); + + requestBody.put("udfAddVo", requestInBody); + Long udfId = addUdf(currentCluster, linkisConfig.getUdfAdmin(), requestBody); + + return udfId; + } + + @Override + public void clientModify(String targetFilePath, File uploadFile, String currentCluster, Map clusterIdMaps, String fileName, String udfDesc + , String udfName, String returnType, String enter, String registerName) throws MetaDataAcquireFailedException, UnExpectedRequestException, JSONException, IOException { + boolean commonJar = fileName.endsWith(".jar"); + boolean pythonScript = fileName.endsWith(".py"); + + Map requestBody = new HashMap<>(1); + Map requestInBody = new HashMap<>(7); + requestInBody.put("path", targetFilePath + File.separator + uploadFile.getName()); + requestInBody.put("id", clusterIdMaps.get(currentCluster)); + 
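Elsewhere in this patch, `checkClusterNameExists` gains a local cache in front of `clusterInfoDao` (a Guava/Caffeine-style `getIfPresent`/`put` cache). A minimal cache-aside sketch of the same shape, substituting a plain `ConcurrentHashMap` and simplified types — `ClusterInfoLookup` and the `String` value type are illustrative stand-ins, not Qualitis classes:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Cache-aside lookup in the shape of checkClusterNameExists in this patch:
// try the local cache, fall back to the backing store (the DAO in Qualitis),
// and populate the cache on a miss.
public class ClusterInfoLookup {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Map<String, String> backingStore; // stands in for clusterInfoDao
    private int daoHits = 0;

    public ClusterInfoLookup(Map<String, String> backingStore) {
        this.backingStore = backingStore;
    }

    public String lookup(String clusterName) {
        String cached = cache.get(clusterName); // like clusterInfoCache.getIfPresent
        if (cached != null) {
            return cached;
        }
        daoHits++;
        String fromStore = backingStore.get(clusterName); // like clusterInfoDao.findByClusterName
        if (fromStore == null) {
            throw new IllegalArgumentException(clusterName + " does not exist");
        }
        cache.put(clusterName, fromStore); // like clusterInfoCache.put
        return fromStore;
    }

    public int getDaoHits() {
        return daoHits;
    }
}
```

Like the patch's explicit get-then-put sequence (and unlike `computeIfAbsent`), concurrent misses for the same key may each hit the backing store once; the cache still converges to a single entry.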
requestInBody.put("description", udfDesc); + requestInBody.put("udfName", udfName); + if (commonJar) { + requestInBody.put("udfType", 0); + requestInBody.put("useFormat", returnType + " " + udfName + "(" + enter + ")"); + requestInBody.put("registerFormat", "create temporary function " + udfName + " as \"" + registerName + "\""); + } else if (pythonScript) { + requestInBody.put("udfType", 1); + requestInBody.put("useFormat", returnType + " " + udfName + "(" + enter + ")"); + requestInBody.put("registerFormat", "udf.register(\"" + udfName + "\", " + registerName + ")"); + } else { + requestInBody.put("udfType", 2); + requestInBody.put("useFormat", udfName + "()"); + requestInBody.put("registerFormat", "sqlContext.udf.register[" + returnType + "," + enter + "](\"" + udfName + "\"," + registerName + ")"); + } + requestBody.put("udfUpdateVo", requestInBody); + modifyUdf(currentCluster, linkisConfig.getUdfAdmin(), requestBody); + } + + @Override + public void shareAndDeploy(Long udfId, String currentCluster, List proxyUserNames, String udfName) { + if (CollectionUtils.isNotEmpty(proxyUserNames)) { + try { + // Share to proxy user. + LOGGER.info("Start to share udf to proxy users"); + proxyUserNames = proxyUserNames.stream().filter(proxyUserName -> ! proxyUserName.equals(linkisConfig.getUdfAdmin())).collect(Collectors.toList()); + shareUdfToProxyUsers(currentCluster, linkisConfig.getUdfAdmin(), proxyUserNames, udfId); + LOGGER.info("Finish to share udf to proxy users"); + + // Deploy new version. 
+                LOGGER.info("Start to get udf new version");
+                String version = getUdfNewVersion(currentCluster, linkisConfig.getUdfAdmin(), udfName);
+                LOGGER.info("Finish to get udf new version: {}", version);
+
+                LOGGER.info("Start to deploy udf new version");
+                deployUdfNewVersion(currentCluster, linkisConfig.getUdfAdmin(), udfId, version);
+                LOGGER.info("Finish to deploy udf new version");
+
+                LOGGER.info("Start to open udf with every proxy user");
+                for (String userName : proxyUserNames) {
+                    switchUdfStatus(currentCluster, udfId, userName, Boolean.TRUE);
+                }
+                LOGGER.info("Finish to open udf with every proxy user");
+            } catch (Exception e) {
+                LOGGER.error("Udf operation in backend failed. Please check udf status in linkis console.", e);
+            }
+        }
+    }
+
+    @Override
+    public Map getUdfDetail(String currentCluster, String userName, Long linkisUdfId) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(currentCluster);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfDetail()).queryParam("udfId", linkisUdfId).toString();
+        Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "get udf detail with linkis api");
+
+        if (checkResponse(response)) {
+            return response;
+        }
+        LOGGER.error("Failed to get udf detail.");
+        return null;
+    }
+
+    @Override
+    public List getDirectory(String category, String clusterName, String userName) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfDirectory()).queryParam("category", category).toString();
+        Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "get udf directory with linkis api");
+
+        if (checkResponse(response)) {
+            return (List) ((Map) response.get("data")).get("userDirectory");
+        }
+        LOGGER.error("Get directory failed.");
+        return new ArrayList<>();
+    }
+
+    @Override
+    public void shareUdfToProxyUsers(String clusterName, String userName, List proxyUserNames, Long udfId) throws IOException, JSONException, UnExpectedRequestException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfShare()).toString();
+        Map requestBody = new HashMap<>(2);
+        Map innerRequestBody = new HashMap<>(1);
+        innerRequestBody.put("id", udfId);
+        requestBody.put("udfInfo", innerRequestBody);
+        requestBody.put("sharedUsers", proxyUserNames);
+
+        Map response = gainResponseLinkisByPostBringJson(clusterInfo, userName, url, "share udf with linkis api", new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to share udf to proxy users.");
+            return;
+        }
+        LOGGER.error("Failed to share udf to proxy users.");
+    }
+
+    @Override
+    public void deleteUdf(String clusterName, Long linkisUdfId, String userName, String fileName) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+        // Step 1
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfDelete()).toString().replace("{UDF_ID}", linkisUdfId.toString());
+
+        Map response = gainResponseLinkisByPost(clusterInfo, userName, url, "delete udf with linkis api");
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to delete udf, start to delete the udf file.");
+            // Step 2
+            String path = new StringBuffer(linkisConfig.getUploadWorkspacePrefix()).append(File.separator).append(userName)
+                .append(File.separator).append("qualitis")
+                .append(File.separator).append(fileName).toString();
+            String deleteUrl = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getDeleteDir()).toString();
+            Map requestBody = Maps.newHashMapWithExpectedSize(1);
+            requestBody.put("path", path);
+
+            try {
+                Map deleteResponse = gainResponseLinkisByPostBringJson(clusterInfo, userName, deleteUrl, "delete udf file with linkis api",
+                    new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+
+                if (checkResponse(deleteResponse)) {
+                    LOGGER.info("Succeed to delete the udf file.");
+                    return;
+                }
+            } catch (Exception e) {
+                LOGGER.error("Failed to delete the udf file.", e);
+            }
+        }
+        LOGGER.error("Failed to delete the udf.");
+    }
+
+    @Override
+    public void switchUdfStatus(String clusterName, Long linkisUdfId, String userName, Boolean isLoad) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfSwitchStatus()).queryParam("udfId", linkisUdfId).queryParam("isLoad", isLoad).toString();
+
+        Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "switch udf status with linkis api");
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to switch udf status.");
+            return;
+        }
+        LOGGER.error("Failed to switch udf status.");
+    }
+
+    @Override
+    public String getUdfNewVersion(String clusterName, String userName, String udfName) throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfNewVersion()).queryParam("udfName", udfName).queryParam("createUser", userName).toString();
+
+        Map response = gainResponseLinkisByGet(clusterInfo, userName, url, "get new version with linkis api");
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to get udf new version.");
+            return (String) ((Map) ((Map) response.get("data")).get("versionInfo")).get("bmlResourceVersion");
+        }
+        LOGGER.error("Failed to get udf new version.");
+        return "";
+    }
+
+    @Override
+    public void deployUdfNewVersion(String clusterName, String userName, Long udfId, String version)
+        throws UnExpectedRequestException, IOException, JSONException, MetaDataAcquireFailedException {
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getUdfPublish()).toString();
+        Map requestBody = new HashMap<>(2);
+        requestBody.put("udfId", udfId);
+        requestBody.put("version", version);
+        Map response = gainResponseLinkisByPostBringJson(clusterInfo, userName, url, "deploy new version with linkis api", new JSONObject(OBJECT_MAPPER.writeValueAsString(requestBody)));
+
+        if (checkResponse(response)) {
+            LOGGER.info("Succeed to deploy udf new version.");
+            return;
+        }
+        LOGGER.error("Failed to deploy udf new version.");
+    }
+
+    @Override
+    public GeneralResponse<Map<String, Object>> deleteEnv(String clusterName, String userName, Long envId) throws UnExpectedRequestException, MetaDataAcquireFailedException {
+        // Check existence of cluster name
+        ClusterInfo clusterInfo = checkClusterNameExists(clusterName);
+        // Send request to delete env
+        String url = getPath(clusterInfo.getLinkisAddress()).path(linkisConfig.getEnvDelete()).toString().replace("{ENV_ID}", envId.toString());
+        Map response = gainResponseLinkisByDelete(clusterInfo, userName, url, "delete env by user and cluster by linkis.");
+
+        Map data = (Map) response.get(LinkisResponseKeyEnum.DATA.getKey());
+        return new GeneralResponse<>("200", "Success to delete env", data);
+    }
+
     private ClusterInfo checkClusterNameExists(String clusterName) throws UnExpectedRequestException {
-        ClusterInfo clusterInfo = clusterInfoDao.findByClusterName(clusterName);
-        if (clusterInfo == null) {
-            throw new UnExpectedRequestException(String.format("%s 集群名称不存在", clusterName));
+        ClusterInfo currentClusterInfo = clusterInfoCache.getIfPresent(clusterName);
+
+        if (currentClusterInfo != null) {
+            LOGGER.info("Getting cluster from local cache, key: {}", clusterName);
+        } else {
+            ClusterInfo clusterInfo = clusterInfoDao.findByClusterName(clusterName);
+            if (clusterInfo == null) {
+                throw new UnExpectedRequestException(String.format("%s 集群名称不存在", clusterName));
+            }
+            clusterInfoCache.put(clusterName, clusterInfo);
+            return clusterInfo;
         }
-        return clusterInfo;
+        return currentClusterInfo;
     }

     private
UriBuilder getPath(String linkisAddress) { return UriBuilder.fromUri(linkisAddress).path(linkisConfig.getPrefix()); } - private boolean checkResponse(Map response) { - Integer responseStatus = (Integer) response.get("status"); + private boolean checkResponse(Map response) { + if (null == response.get(STATUS)) { + return false; + } + Integer responseStatus = (Integer) response.get(STATUS); return responseStatus == 0; } diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/OperateCiServiceImpl.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/OperateCiServiceImpl.java new file mode 100644 index 00000000..81624dd5 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/OperateCiServiceImpl.java @@ -0,0 +1,354 @@ +package com.webank.wedatasphere.qualitis.client.impl; + +import com.google.common.collect.Lists; +import com.webank.wedatasphere.qualitis.client.config.MetricPropertiesConfig; +import com.webank.wedatasphere.qualitis.client.config.OperateCiConfig; +import com.webank.wedatasphere.qualitis.client.constant.OperateEnum; +import com.webank.wedatasphere.qualitis.client.request.OperateRequest; +import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.client.OperateCiService; +import com.webank.wedatasphere.qualitis.metadata.response.*; +import com.webank.wedatasphere.qualitis.response.GeneralResponse; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.collections.MapUtils; +import org.apache.commons.lang3.StringUtils; +import org.codehaus.jackson.map.ObjectMapper; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.beans.factory.annotation.Value; +import org.springframework.http.HttpEntity; +import 
org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.http.MediaType; +import org.springframework.stereotype.Service; +import org.springframework.web.client.ResourceAccessException; +import org.springframework.web.client.RestTemplate; + +import javax.ws.rs.core.UriBuilder; +import java.io.IOException; +import java.util.*; + +/** + * @author allenzhou@webank.com + * @date 2021/3/2 10:58 + */ +@Service +public class OperateCiServiceImpl implements OperateCiService { + @Autowired + private MetricPropertiesConfig metricPropertiesConfig; + + @Autowired + private OperateCiConfig operateCiConfig; + + @Autowired + private RestTemplate restTemplate; + + @Value("${department.data_source_from: custom}") + private String departmentSourceType; + + @Value("${deploy.environment: open_source}") + private String deployEnvType; + + private static final Logger LOGGER = LoggerFactory.getLogger(OperateCiServiceImpl.class); + + @Override + public List getAllSubSystemInfo() throws UnExpectedRequestException { +// 仅限开源环境 + if ("open_source".equals(deployEnvType)) { + return Collections.emptyList(); + } + + Map response = requestCmdb(OperateEnum.SUB_SYSTEM, "A problem occurred when converting the request body to json.", "{&FAILED_TO_GET_SUB_SYSTEM_INFO}", "Start to get sub_system info from cmdb. url: {}, method: {}, body: {}", "Succeed to get sub_system info from cmdb. 
response.");
+
+        List content = checkResponse(response);
+
+        List<SubSystemResponse> responses = new ArrayList<>(content.size());
+
+        for (int i = 0; i < content.size(); i++) {
+            SubSystemResponse tempResponse = new SubSystemResponse();
+            Object current = content.get(i);
+
+            Integer currentSubsystemId = (Integer) ((Map) current).get("subsystem_id");
+            tempResponse.setSubSystemId(currentSubsystemId);
+
+            String currentSubSystemName = (String) ((Map) current).get("subsystem_name");
+            tempResponse.setSubSystemName(currentSubSystemName);
+
+            String currentFullCnName = (String) ((Map) current).get("full_cn_name");
+            tempResponse.setSubSystemFullCnName(currentFullCnName);
+
+            List<Map<String, Object>> opsList = (List) ((Map) current).get("pro_oper_group");
+            List<Map<String, Object>> deptList = (List) ((Map) current).get("busiResDept");
+            List<Map<String, Object>> devList = (List) ((Map) current).get("devdept");
+
+            String dept = "";
+            String opsDept = "";
+            String devDept = "";
+
+            if (CollectionUtils.isNotEmpty(deptList)) {
+                dept = (String) (deptList.iterator().next()).get("v");
+                tempResponse.setDepartmentName(dept);
+            }
+
+            if (CollectionUtils.isNotEmpty(opsList)) {
+                opsDept = (String) (opsList.iterator().next()).get("v");
+                if (StringUtils.isEmpty(dept)) {
+                    String[] infos = opsDept.split(SpecCharEnum.MINUS.getValue());
+                    if (infos.length == 2) {
+                        dept = infos[0];
+                        tempResponse.setDepartmentName(dept);
+                        tempResponse.setOpsDepartmentName(infos[1]);
+                    } else {
+                        tempResponse.setOpsDepartmentName(infos[0]);
+                    }
+                } else {
+                    tempResponse.setOpsDepartmentName(opsDept.replace(dept + "-", ""));
+                }
+            }
+
+            if (CollectionUtils.isNotEmpty(devList)) {
+                devDept = (String) (devList.iterator().next()).get("v");
+                if (StringUtils.isEmpty(dept)) {
+                    String[] infos = devDept.split(SpecCharEnum.MINUS.getValue());
+                    if (infos.length == 2) {
+                        dept = infos[0];
+                        tempResponse.setDepartmentName(dept);
+                        tempResponse.setDevDepartmentName(infos[1]);
+                    } else {
+                        tempResponse.setDevDepartmentName(infos[0]);
+                    }
+                } else {
+                    tempResponse.setDevDepartmentName(devDept.replace(dept + "-", ""));
+                }
+            }
+
+            responses.add(tempResponse);
+        }
+
+        return responses;
+    }
+
+    private Map requestCmdb(OperateEnum subSystem, String problemDescribe, String international, String requestInfo, String successInfo) throws UnExpectedRequestException {
+        String url = UriBuilder.fromUri(operateCiConfig.getHost()).path(operateCiConfig.getUrl()).toString();
+
+        HttpHeaders headers = new HttpHeaders();
+        headers.setContentType(MediaType.APPLICATION_JSON);
+        ObjectMapper objectMapper = new ObjectMapper();
+        // Construct request body.
+        OperateRequest request = new OperateRequest(subSystem.getCode());
+        request.setUserAuthKey(operateCiConfig.getUserAuthKey());
+        HttpEntity entity = null;
+        try {
+            String jsonRequest = objectMapper.writeValueAsString(request);
+            LOGGER.info("Operate request: {}", jsonRequest);
+            entity = new HttpEntity<>(jsonRequest, headers);
+        } catch (IOException e) {
+            LOGGER.error(problemDescribe);
+            throw new UnExpectedRequestException(international);
+        }
+        LOGGER.info(requestInfo, url, javax.ws.rs.HttpMethod.POST, entity);
+        Map response = restTemplate.postForObject(url, entity, Map.class);
+        LOGGER.info(successInfo);
+        return response;
+    }
+
+    private List checkResponse(Map response) throws UnExpectedRequestException {
+        int code = (int) ((Map) response.get("headers")).get("retCode");
+
+        List content = (List) ((Map) response.get("data")).get("content");
+        int size = (int) ((Map) response.get("headers")).get("contentRows");
+        LOGGER.info("Num of operate info is: {}", size);
+
+        if (0 == code && content.size() == size) {
+            return content;
+        } else {
+            throw new UnExpectedRequestException("The result of operate info is not correct.");
+        }
+    }
+
+    @Override
+    public List<ProductResponse> getAllProductInfo() throws UnExpectedRequestException {
+        if ("open_source".equals(deployEnvType)) {
+            return Collections.emptyList();
+        }
+
+        String url = UriBuilder.fromUri(operateCiConfig.getHost()).path(operateCiConfig.getUrl()).toString();
+
+        HttpHeaders headers = new
HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + ObjectMapper objectMapper = new ObjectMapper(); + // Construct request body. + OperateRequest request = new OperateRequest(OperateEnum.PRODUCT.getCode()); + request.setUserAuthKey(operateCiConfig.getUserAuthKey()); + HttpEntity entity = null; + try { + String jsonRequest = objectMapper.writeValueAsString(request); + LOGGER.info("Operate request: {}", jsonRequest); + entity = new HttpEntity<>(jsonRequest, headers); + } catch (IOException e) { + LOGGER.error("A problem occurred when converting the request body to json. "); + throw new UnExpectedRequestException("{&FAILED_TO_GET_PRODUCT_INFO}"); + } + LOGGER.info("Start to get product info from cmdb. url: {}, method: {}, body: {}", url, javax.ws.rs.HttpMethod.POST, entity); + Map response = restTemplate.postForObject(url, entity, Map.class); + LOGGER.info("Succeed to get product info from cmdb. response."); + + List content = checkResponse(response); + + List responses = new ArrayList<>(content.size()); + for (int i = 0; i < content.size(); i++) { + ProductResponse tempResponse = new ProductResponse(); + Object current = content.get(i); + + String currentProductId = ((Map) current).get("product_cd"); + tempResponse.setProductId(currentProductId); + + String currentProductName = ((Map) current).get("cn_name"); + tempResponse.setProductName(currentProductName); + + responses.add(tempResponse); + } + + return responses; + } + + @Override + public List getAllDepartmetInfo() throws UnExpectedRequestException { + Integer pId = 100000; + LOGGER.info("Start to get department info from esb. 
PID: {}", pId); + List departmentSubResponses = getDevAndOpsInfo(pId); + LOGGER.info("Succeed to get department response from esb."); + + List responses = Lists.newArrayListWithCapacity(departmentSubResponses.size()); + List departmentList = Arrays.asList(metricPropertiesConfig.getWhiteList().split(SpecCharEnum.COMMA.getValue())); + + for (int i = 0; i < departmentSubResponses.size(); i++) { + DepartmentSubResponse current = departmentSubResponses.get(i); + + String departmentCode = current.getId(); + String departmentName = current.getName(); + LOGGER.info("The [{}]-th department name is [{}]", i, departmentName); + CmdbDepartmentResponse cmdbDepartmentResponse = new CmdbDepartmentResponse(); + cmdbDepartmentResponse.setCode(departmentCode); + cmdbDepartmentResponse.setName(departmentName); + if (departmentList.contains(departmentName)) { + cmdbDepartmentResponse.setDisable("1"); + } else { + cmdbDepartmentResponse.setDisable("0"); + } + + responses.add(cmdbDepartmentResponse); + } + + return responses; + } + + @Override + public List getDevAndOpsInfo(Integer deptCode) throws UnExpectedRequestException { + String url = UriBuilder.fromUri(operateCiConfig.getEfHost()).path(operateCiConfig.getEfUrl()) + .queryParam("AppToken", operateCiConfig.getEfAppToken()) + .queryParam("AppId", operateCiConfig.getEfAppId()) + .toString(); + + HttpHeaders headers = new HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + HttpEntity entity = new HttpEntity<>(headers); + Map response; + try { + LOGGER.info("Start to get dev and ops info from ef. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class).getBody(); + } catch (ResourceAccessException e) { + LOGGER.error(e.getMessage(), e); + throw new UnExpectedRequestException("{&FAILED_TO_GET_DEPARTMENT_INFO}"); + } + LOGGER.info("Finish to get dev and ops info from ef. 
url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + Integer responseCode = (Integer) response.get("Code"); + if (responseCode == null || !responseCode.equals(0)) { + throw new UnExpectedRequestException("The result of dev and ops info from ef is not correct."); + } + + LOGGER.info("Success to get dev and ops info from ef. url: {}, method: {}, body: {}", url, HttpMethod.GET, entity); + List> content = (List>) ((Map) response.get("Result")).get("Data"); + List responses = new ArrayList<>(128); + for (int i = 0; i < content.size(); i++) { + DepartmentSubResponse departmentSubResponse = new DepartmentSubResponse(); + Map current = content.get(i); + Integer pId = (Integer) current.get("PID"); + if (pId == null || !deptCode.equals(pId)) { + continue; + } + + String orgName = (String) current.get("OrgName"); + Object id = current.get("ID"); + departmentSubResponse.setId(id.toString()); + departmentSubResponse.setName(orgName); + responses.add(departmentSubResponse); + } + + return responses; + } + + @Override + public GeneralResponse getDcn(Long subSystemId) throws UnExpectedRequestException { + String url = UriBuilder.fromUri(operateCiConfig.getHost()).path(operateCiConfig.getIntegrateUrl()).toString(); + + HttpHeaders headers = new HttpHeaders(); + headers.setContentType(MediaType.APPLICATION_JSON); + ObjectMapper objectMapper = new ObjectMapper(); + // Construct request body. 
+ OperateRequest request = new OperateRequest(OperateEnum.SUB_SYSTEM_FIND_DCN.getCode()); + request.setUserAuthKey(operateCiConfig.getNewUserAuthKey()); + request.getFilter().put("subsystem_id", subSystemId.toString()); + HttpEntity entity; + try { + String jsonRequest = objectMapper.writeValueAsString(request); + LOGGER.info("Operate request: {}", jsonRequest); + entity = new HttpEntity<>(jsonRequest, headers); + } catch (IOException e) { + LOGGER.error("Failed to get dcn by subsystem."); + throw new UnExpectedRequestException("Failed to get dcn by subsystem."); + } + LOGGER.info("Start to get dcn by subsystem.", url, javax.ws.rs.HttpMethod.POST, entity); + Map response = restTemplate.postForObject(url, entity, Map.class); + LOGGER.info("Finished to get dcn by subsystem."); + + DcnResponse dcnResponse = new DcnResponse((List>) ((Map) response.get("data")).get("content")); + + // Filter MASTER + if (Boolean.TRUE.equals(operateCiConfig.getOnlySlave())) { + filterDcn(dcnResponse); + } + + return new GeneralResponse<>("200", "Success to get dcn by subsystem", dcnResponse); + } + + private void filterDcn(DcnResponse dcnResponse) { + Map>>> resMap = dcnResponse.getRes(); + Iterator>>>> resIterator = resMap.entrySet().iterator(); + while (resIterator.hasNext()) { + Map.Entry>>> res = resIterator.next(); + Map>> dcnMap = res.getValue(); + Iterator>>> dcnIterator = dcnMap.entrySet().iterator(); + while (dcnIterator.hasNext()) { + List> logicDcns = dcnIterator.next().getValue(); + ListIterator> logicDcnIterator = logicDcns.listIterator(); + while (logicDcnIterator.hasNext()) { + Map dcn = logicDcnIterator.next(); + if ("MASTER".equals(dcn.get("set_type"))) { + logicDcnIterator.remove(); + } + } + if (CollectionUtils.isEmpty(logicDcns)) { + dcnIterator.remove(); + } + } + if (MapUtils.isEmpty(dcnMap)) { + resIterator.remove(); + } + } + } +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/RuleClientImpl.java 
b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/RuleClientImpl.java new file mode 100644 index 00000000..4a04c1e6 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/impl/RuleClientImpl.java @@ -0,0 +1,205 @@ +package com.webank.wedatasphere.qualitis.client.impl; + +import com.google.gson.Gson; +import com.google.gson.reflect.TypeToken; +import com.webank.wedatasphere.qualitis.client.config.DataMapConfig; +import com.webank.wedatasphere.qualitis.encoder.Sha256Encoder; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.metadata.client.RuleClient; +import com.webank.wedatasphere.qualitis.metadata.exception.MetaDataAcquireFailedException; +import com.webank.wedatasphere.qualitis.metadata.response.DataInfo; +import com.webank.wedatasphere.qualitis.metadata.response.DataMapResultInfo; +import com.webank.wedatasphere.qualitis.metadata.response.table.TableMetadataInfo; +import com.webank.wedatasphere.qualitis.metadata.response.table.TableTagInfo; +import com.webank.wedatasphere.qualitis.util.UuidGenerator; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.lang3.StringUtils; +import org.apache.http.HttpStatus; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.http.HttpHeaders; +import org.springframework.http.HttpMethod; +import org.springframework.stereotype.Component; +import org.springframework.web.client.ResourceAccessException; +import org.springframework.web.client.RestTemplate; + +import javax.ws.rs.core.UriBuilder; +import java.util.List; +import java.util.Map; +import java.util.Objects; +import java.util.Optional; +import java.util.stream.Collectors; + +/** + * @author v_minminghe@webank.com + * @date 2022-05-31 9:40 + * @description + */ +@Component +public class RuleClientImpl implements RuleClient 
{ + + private static final Logger LOGGER = LoggerFactory.getLogger(RuleClientImpl.class); + @Autowired + private DataMapConfig dataMapConfig; + @Autowired + private RestTemplate restTemplate; + + @Override + public TableTagInfo getTableTag(String sourceType, String clusterType, String dbName, String tableName, String loginUser) throws MetaDataAcquireFailedException, ResourceAccessException, UnExpectedRequestException { + validateParameter(sourceType, clusterType, dbName, tableName); + + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getDatasetTagRelationsPath()); + uriBuilder.queryParam("sourceType", sourceType); + uriBuilder.queryParam("clusterType", clusterType); + uriBuilder.queryParam("dbCode", dbName); + uriBuilder.queryParam("datasetName", tableName); + constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + LOGGER.info("Start to get table tag by dms. url: {}, method: {},", uriBuilder, HttpMethod.GET); + String response = restTemplate.getForObject(uriBuilder + "&confirmUser&tagCode", String.class); + LOGGER.info("Finish to get table tag by dms. response: {}", response); + if (StringUtils.isEmpty(response)) { + throw new MetaDataAcquireFailedException("Error!network occurred an unexpected error", 200); + } + return convertTableTagInfo(response); + } + + private TableTagInfo convertTableTagInfo(String response) throws MetaDataAcquireFailedException { + Gson gson = new Gson(); + DataMapResultInfo> dataMapResultInfo = gson.fromJson(response, + new TypeToken>>() { + }.getType()); + if (!String.valueOf(HttpStatus.SC_OK).equals(dataMapResultInfo.getCode())) { + throw new MetaDataAcquireFailedException("Error! 
Can not get table tag from DataMap, message: " + dataMapResultInfo.getMsg(), 200); + } + DataInfo data = dataMapResultInfo.getData(); + if (Objects.nonNull(data)) { + List content = data.getContent(); + if (CollectionUtils.isNotEmpty(content)) { + return content.get(0); + } + } + return TableTagInfo.build(); + } + + @Override + public TableMetadataInfo getMetaData(String sourceType, String clusterType, String dbName, String tableName, String loginUser) throws MetaDataAcquireFailedException, ResourceAccessException, UnExpectedRequestException { + validateParameter(sourceType, clusterType, dbName, tableName); + + UriBuilder uriBuilder = UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getQueryAllPath()); + uriBuilder.queryParam("searchType", "TABLE"); + uriBuilder.queryParam("sourceType", sourceType.substring(0, 1).toUpperCase() + sourceType.substring(1)); + uriBuilder.queryParam("searchKey", tableName); + uriBuilder.queryParam("clusterType", clusterType); + uriBuilder.queryParam("pageNo", 1); + uriBuilder.queryParam("pageSize", 10); + constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + LOGGER.info("Start to get metaData table by dms. url: {}, method: {}", uriBuilder, HttpMethod.GET); + String response = restTemplate.getForObject(uriBuilder.toString(), String.class); + LOGGER.info("Finish to get metaData table by dms. 
response: {}", response); + if (StringUtils.isEmpty(response)) { + throw new MetaDataAcquireFailedException("Error! The network encountered an unexpected error", 200); + } + return convertTableMetadataInfo(response, clusterType, dbName, tableName); + } + + private TableMetadataInfo convertTableMetadataInfo(String response, String clusterType, String dbName, String tableName) throws MetaDataAcquireFailedException { + Gson gson = new Gson(); + DataMapResultInfo> dataMapResultInfo = gson.fromJson(response, new TypeToken>>() { + }.getType()); + if (!String.valueOf(HttpStatus.SC_OK).equals(dataMapResultInfo.getCode())) { + throw new MetaDataAcquireFailedException("Error! Can not get metaData from DataMap, message: " + dataMapResultInfo.getMsg(), 200); + } + Map dataMap = dataMapResultInfo.getData(); + if (Objects.nonNull(dataMap)) { + Object valList = dataMap.get("valList"); + List deptInfoList = gson.fromJson(gson.toJson(valList), new TypeToken>() { + }.getType()); + if (CollectionUtils.isNotEmpty(deptInfoList)) { + Optional optional = deptInfoList.stream().filter(result -> verificationResult(result, clusterType, dbName, tableName)).findFirst(); + if (optional.isPresent()) { + return optional.get(); + } + } + } + return TableMetadataInfo.build(); + } + + private boolean verificationResult(TableMetadataInfo result, String clusterType, String dbName, String tableName) { + List> pathList = result.getPathList(); + if (CollectionUtils.isNotEmpty(pathList)) { + List pathItemList = pathList.stream().map(map -> map.get("name")).collect(Collectors.toList()); + if (StringUtils.isEmpty(dbName)) { + return pathItemList.contains(clusterType) && tableName.equals(result.getRawName()); + } + return pathItemList.contains(clusterType) && pathItemList.contains(dbName) && tableName.equals(result.getRawName()); + } + return false; + } + + @Override + public DataInfo getTagList(String loginUser, int page, int size) throws MetaDataAcquireFailedException { + UriBuilder uriBuilder = 
UriBuilder.fromUri(dataMapConfig.getAddress()) + .path(dataMapConfig.getTagsPath()); + uriBuilder.queryParam("pageNo", page); + uriBuilder.queryParam("pageSize", size); + constructUrlWithSignature(uriBuilder, loginUser); + + HttpHeaders headers = new HttpHeaders(); + headers.add("isAuth", String.valueOf(false)); + LOGGER.info("Start to get table tag from dms. url: {}, method: {}", uriBuilder, HttpMethod.GET); + String response = restTemplate.getForObject(uriBuilder.toString(), String.class); + LOGGER.info("Finish to get table tag from dms. response: {}", response); + if (StringUtils.isEmpty(response)) { + throw new MetaDataAcquireFailedException("Error! The network encountered an unexpected error", 200); + } + return convertTagInfo(response); + } + + private DataInfo convertTagInfo(String response) throws MetaDataAcquireFailedException { + Gson gson = new Gson(); + DataMapResultInfo> dataMapResultInfo = gson.fromJson(response, + new TypeToken>>() { + }.getType()); + if (!String.valueOf(HttpStatus.SC_OK).equals(dataMapResultInfo.getCode())) { + throw new MetaDataAcquireFailedException("Error! 
Can not get table tag from dms, message: " + dataMapResultInfo.getMsg(), 200); + } + return dataMapResultInfo.getData(); + } + + private void validateParameter(String sourceType, String clusterType, String dbName, String tableName) throws UnExpectedRequestException { + if (StringUtils.isEmpty(sourceType) + || StringUtils.isEmpty(clusterType) + || StringUtils.isEmpty(dbName) + || StringUtils.isEmpty(tableName)) { + throw new UnExpectedRequestException("parameters must not be null"); + } + } + + /** + * If a large number of requests are executed, the nonce may conflict and cause HTTP 403. + * @param uriBuilder + * @param loginUser + */ + private void constructUrlWithSignature(UriBuilder uriBuilder, String loginUser) { + String nonce = UuidGenerator.generateRandom(5); + String timestamp = String.valueOf(System.currentTimeMillis()); + String signature = Sha256Encoder.encode(Sha256Encoder.encode(dataMapConfig.getAppId() + nonce + loginUser + timestamp) + dataMapConfig.getAppToken()); + uriBuilder.queryParam("appid", dataMapConfig.getAppId()); + uriBuilder.queryParam("nonce", nonce); + uriBuilder.queryParam("timestamp", timestamp); + uriBuilder.queryParam("loginUser", loginUser); + uriBuilder.queryParam("signature", signature); + uriBuilder.queryParam("isolateEnvFlag", dataMapConfig.getIsolateEnvFlag()); + } + + +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/AskLinkisParameter.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/AskLinkisParameter.java new file mode 100644 index 00000000..2de14c01 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/AskLinkisParameter.java @@ -0,0 +1,54 @@ +package com.webank.wedatasphere.qualitis.client.request; + +/** + * @author v_gaojiedeng + */ +public class AskLinkisParameter { + + private String url; + private String linkisToken; + private String authUser; + private String logmessage; + + public AskLinkisParameter() { + } + + 
public AskLinkisParameter(String url, String linkisToken, String authUser, String logmessage) { + this.url = url; + this.linkisToken = linkisToken; + this.authUser = authUser; + this.logmessage = logmessage; + } + + public String getLogmessage() { + return logmessage; + } + + public void setLogmessage(String logmessage) { + this.logmessage = logmessage; + } + + public String getUrl() { + return url; + } + + public void setUrl(String url) { + this.url = url; + } + + public String getLinkisToken() { + return linkisToken; + } + + public void setLinkisToken(String linkisToken) { + this.linkisToken = linkisToken; + } + + public String getAuthUser() { + return authUser; + } + + public void setAuthUser(String authUser) { + this.authUser = authUser; + } +} diff --git a/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/OperateRequest.java b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/OperateRequest.java new file mode 100644 index 00000000..aa92b0f1 --- /dev/null +++ b/core/meta_data/src/main/java/com/webank/wedatasphere/qualitis/client/request/OperateRequest.java @@ -0,0 +1,158 @@ +package com.webank.wedatasphere.qualitis.client.request; + +import com.webank.wedatasphere.qualitis.client.constant.OperateEnum; +import java.util.ArrayList; +import java.util.HashMap; +import java.util.List; +import java.util.Map; + +/** + * @author allenzhou@webank.com + * @date 2021/3/2 14:47 + */ +public class OperateRequest { + private String userAuthKey; + private String type; + private int startIndex; + private int pageSize; + private String action; + private Boolean isPaging; + private List resultColumn; + private Map filter; + + public OperateRequest(int code) { + if (code == OperateEnum.SUB_SYSTEM.getCode()) { + type = "wb_subsystem"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(3); + resultColumn.add("subsystem_id"); + 
resultColumn.add("subsystem_name"); + resultColumn.add("full_cn_name"); + resultColumn.add("devdept"); + resultColumn.add("busiResDept"); + resultColumn.add("pro_oper_group"); + } else if (code == OperateEnum.PRODUCT.getCode()) { + type = "wb_product_cd"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(2); + resultColumn.add("product_cd"); + resultColumn.add("cn_name"); + } else if (code == OperateEnum.DEPARTMENT.getCode()) { + type = "wb_sec_level_dep"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(2); + resultColumn.add("dep_name"); + resultColumn.add("dep_code"); + } else if (code == OperateEnum.DEV_DEPARTMENT.getCode()) { + type = "wb_dev_oper_group"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(1); + resultColumn.add("group_name"); + } else if (code == OperateEnum.OPS_DEPARTMENT.getCode()) { + type = "wb_test_oper_group"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(1); + resultColumn.add("group_name"); + } else if (code == OperateEnum.SUB_SYSTEM_FIND_DCN.getCode()) { + filter = new HashMap<>(); + type = "DB_subsystem_tdsql"; + startIndex = 0; + pageSize = Integer.MAX_VALUE; + action = "select"; + isPaging = false; + resultColumn = new ArrayList<>(1); + resultColumn.add("db_name"); + resultColumn.add("idc"); + resultColumn.add("phy_set_name"); + resultColumn.add("set_type"); + resultColumn.add("vip"); + resultColumn.add("logic_area"); + resultColumn.add("gwport"); + resultColumn.add("dbinstance_name"); + resultColumn.add("clu_name"); + resultColumn.add("set_name"); + resultColumn.add("dcn_num"); + resultColumn.add("logic_dcn"); + } + + } + + public String getUserAuthKey() { + return userAuthKey; + } + + public void setUserAuthKey(String userAuthKey) { + 
this.userAuthKey = userAuthKey; + } + + public String getType() { + return type; + } + + public void setType(String type) { + this.type = type; + } + + public int getStartIndex() { + return startIndex; + } + + public void setStartIndex(int startIndex) { + this.startIndex = startIndex; + } + + public int getPageSize() { + return pageSize; + } + + public void setPageSize(int pageSize) { + this.pageSize = pageSize; + } + + public String getAction() { + return action; + } + + public void setAction(String action) { + this.action = action; + } + + public Boolean getPaging() { + return isPaging; + } + + public void setPaging(Boolean paging) { + isPaging = paging; + } + + public List getResultColumn() { + return resultColumn; + } + + public void setResultColumn(List resultColumn) { + this.resultColumn = resultColumn; + } + + public Map getFilter() { + return filter; + } + + public void setFilter(Map filter) { + this.filter = filter; + } +} diff --git a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/RuleMetricDao.java b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/RuleMetricDao.java index 71dee3da..4b48ddbd 100644 --- a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/RuleMetricDao.java +++ b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/RuleMetricDao.java @@ -19,7 +19,9 @@ import com.webank.wedatasphere.qualitis.entity.Department; import com.webank.wedatasphere.qualitis.entity.RuleMetric; import com.webank.wedatasphere.qualitis.entity.User; + import java.util.List; +import java.util.Set; /** * @author allenzhou @@ -27,63 +29,96 @@ public interface RuleMetricDao { /** * Query pageable rule metrics with SYS_ADMIN. 
+ * * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param tableDataType + * @param createUser + * @param modifyUser * @param page * @param size * @return */ - List queryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, int page, - int size); + List queryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, + Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser, int page, int size); /** * Count query rule metrics with SYS_ADMIN. + * * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param tableDataType + * @param createUser + * @param modifyUser * @return */ - long countQueryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available); + long countQueryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, + Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser); /** * Query pageable rule metrics with different characters(DEPARTMENT_ADMIN, PROJECTOR). 
- * @param level - * @param departmentList - * @param user + * * @param subSystemName * @param ruleMetricName * @param enCode * @param type + * @param requestAvailable * @param available + * @param multiEnvs + * @param tableDataType + * @param dataVisibilityDeptList + * @param createUser + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param buildUser + * @param modifyUser * @param page * @param size * @return */ - List queryRuleMetrics(Integer level, List departmentList, - User user, String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, int page, int size); + List queryRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean requestAvailable, Boolean available, + Boolean multiEnvs, String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange, String buildUser, String modifyUser, int page, int size); /** * Count query rule metrics with different characters(DEPARTMENT_ADMIN, PROJECTOR). - * @param level - * @param departmentList - * @param user + * * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param tableDataType + * @param dataVisibilityDeptList + * @param createUser + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param buildUser + * @param modifyUser * @return */ - long countQueryRuleMetrics(Integer level, List departmentList, - User user, String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available); + long countQueryRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs + , String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange, String buildUser, String modifyUser); /** * Query rule metrics with name. 
+ * * @param level * @param departmentList * @param user @@ -96,6 +131,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Count of querying rule metrics with name. + * * @param level * @param departmentList * @param user @@ -104,8 +140,33 @@ long countQueryRuleMetrics(Integer level, List departmentList, */ long countWithRuleMetricName(Integer level, List departmentList, User user, String name); + /** + * Query rule metrics by sub-system-id. + * + * @param level + * @param departmentList + * @param user + * @param subSystemId + * @param page + * @param size + * @return + */ + List findBySubSystemId(Integer level, List departmentList, User user, long subSystemId, int page, int size); + + /** + * Count of querying rule metrics by sub-system-id. + * + * @param level + * @param departmentList + * @param user + * @param subSystemId + * @return + */ + long countBySubSystemId(Integer level, List departmentList, User user, long subSystemId); + /** * Find all rule metrics. + * * @param page * @param size * @return @@ -114,12 +175,14 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Count all rule metrics. + * * @return */ long countAllRuleMetrics(); /** * Find pageable rule metrics with different characters(SYS_ADMIN, DEPARTMENT_ADMIN, PROJECTOR). + * * @param level * @param departmentList * @param user @@ -131,6 +194,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Count all rule metrics with different characters(SYS_ADMIN, DEPARTMENT_ADMIN, PROJECTOR). 
+ * * @param level * @param departmentList * @param user @@ -138,8 +202,99 @@ long countQueryRuleMetrics(Integer level, List departmentList, */ long countRuleMetrics(Integer level, List departmentList, User user); + /** + * Find not used rule metric + * + * @param level + * @param departmentList + * @param user + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @param page + * @param size + * @return + */ + List findNotUsed(Integer level, List departmentList, + User user, String createUser, String tableDataType, List dataVisibilityDeptList, int page, int size); + + /** + * Count not used rule metric + * + * @param level + * @param departmentList + * @param user + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @return + */ + long countNotUsed(Integer level, List departmentList, User user, String createUser, String tableDataType, List dataVisibilityDeptList); + + /** + * Find not used rule metric + * + * @param page + * @param size + * @return + */ + List findAllNotUsed(int page, int size); + + /** + * Count not used rule metric + * + * @return + */ + long countAllNotUsed(); + + /** + * Find used rule metric + * + * @param level + * @param departmentList + * @param user + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @param page + * @param size + * @return + */ + List findUsed(Integer level, List departmentList, + User user, String createUser, String tableDataType, List dataVisibilityDeptList, int page, int size); + + /** + * Count used rule metric + * + * @param level + * @param departmentList + * @param user + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @return + */ + long countUsed(Integer level, List departmentList, User user, String createUser, String tableDataType, List dataVisibilityDeptList); + + /** + * Find used rule metric + * + * @param page + * @param size + * @return + */ + List findAllUsed(int page, int 
size); + + /** + * Count used rule metric + * + * @return + */ + long countAllUsed(); + /** * Add + * * @param ruleMetric * @return */ @@ -147,6 +302,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Modify + * * @param ruleMetric * @return */ @@ -154,12 +310,14 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Delete + * * @param ruleMetric */ void delete(RuleMetric ruleMetric); /** * Find by id. + * * @param id * @return */ @@ -167,6 +325,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Find by en code. + * * @param name * @return */ @@ -174,6 +333,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Find by IDs. + * * @param ids * @return */ @@ -181,6 +341,7 @@ long countQueryRuleMetrics(Integer level, List departmentList, /** * Find by name. + * * @param name * @return */ diff --git a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/RuleMetricDaoImpl.java b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/RuleMetricDaoImpl.java index f16d1561..73e1edbc 100644 --- a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/RuleMetricDaoImpl.java +++ b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/impl/RuleMetricDaoImpl.java @@ -5,13 +5,17 @@ import com.webank.wedatasphere.qualitis.entity.Department; import com.webank.wedatasphere.qualitis.entity.RuleMetric; import com.webank.wedatasphere.qualitis.entity.User; -import java.util.List; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.PageRequest; import org.springframework.data.domain.Pageable; import org.springframework.data.domain.Sort; +import org.springframework.data.domain.Sort.Direction; import org.springframework.stereotype.Repository; +import java.util.List; +import java.util.Optional; +import java.util.Set; + /** * @author allenzhou */ @@ -21,61 +25,75 @@ public class RuleMetricDaoImpl implements 
RuleMetricDao { private RuleMetricRepository ruleMetricRepository; @Override - public List queryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + public List queryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser, int page, int size) { + Sort sort = Sort.by(Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); - return ruleMetricRepository.queryAll(subSystemName, ruleMetricName, enCode, type, available, pageable).getContent(); + return ruleMetricRepository.queryAll(subSystemName, ruleMetricName, enCode, type, available, multiEnvs, devDepartmentId, opsDepartmentId, actionRange, tableDataType, createUser, modifyUser, pageable).getContent(); } @Override - public long countQueryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available) { - return ruleMetricRepository.countQueryAll(subSystemName, ruleMetricName, enCode, type, available); + public long countQueryAllRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser) { + return ruleMetricRepository.countQueryAll(subSystemName, ruleMetricName, enCode, type, available, multiEnvs, devDepartmentId, opsDepartmentId, actionRange, tableDataType, createUser, modifyUser); } @Override - public List queryRuleMetrics(Integer level, List departmentList, User user, String subSystemName, String ruleMetricName - , String enCode, Integer type, Boolean available, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + public List 
queryRuleMetrics(String subSystemName, String ruleMetricName + , String enCode, Integer type, Boolean requestAvailable, Boolean available, Boolean multiEnvs, String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange, String buildUser, String modifyUser, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); - return ruleMetricRepository.queryRuleMetrics(level, departmentList, user, subSystemName, ruleMetricName, enCode, type, available, pageable).getContent(); + return ruleMetricRepository.queryRuleMetrics(subSystemName, ruleMetricName, enCode, type, available, multiEnvs, tableDataType, dataVisibilityDeptList, createUser, devDepartmentId, opsDepartmentId, actionRange, buildUser, modifyUser, pageable).getContent(); } @Override - public long countQueryRuleMetrics(Integer level, List departmentList, User user, String subSystemName, String ruleMetricName, String enCode - , Integer type, Boolean available) { - return ruleMetricRepository.countQueryRuleMetrics(level, departmentList, user, subSystemName, ruleMetricName, enCode, type, available); + public long countQueryRuleMetrics(String subSystemName, String ruleMetricName, + String enCode, Integer type, Boolean available, Boolean multiEnvs, String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange, String buildUser, String modifyUser) { + return ruleMetricRepository.countQueryRuleMetrics(subSystemName, ruleMetricName, enCode, type, available, multiEnvs, tableDataType, dataVisibilityDeptList, createUser, devDepartmentId, opsDepartmentId, actionRange, buildUser, modifyUser); } @Override public List findWithRuleMetricName(Integer level, List departmentList, - User user, String name, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + User user, String name, int page, int size) { + 
Sort sort = Sort.by(Sort.Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); return ruleMetricRepository.findWithRuleMetricName(level, departmentList, user, "%".concat(name).concat("%"), pageable).getContent(); } @Override public long countWithRuleMetricName(Integer level, List departmentList, - User user, String name) { + User user, String name) { return ruleMetricRepository.countWithRuleMetricName(level, departmentList, user, "%".concat(name).concat("%")); } + @Override + public List findBySubSystemId(Integer level, List departmentList, + User user, long subSystemId, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return ruleMetricRepository.findBySubSystemId(level, departmentList, user, subSystemId, pageable).getContent(); + } + + @Override + public long countBySubSystemId(Integer level, List departmentList, + User user, long subSystemId) { + return ruleMetricRepository.countBySubSystemId(level, departmentList, user, subSystemId); + } + @Override public List findAllRuleMetrics(int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + Sort sort = Sort.by(Sort.Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); return ruleMetricRepository.findAll(pageable).getContent(); } @Override public long countAllRuleMetrics() { - return ruleMetricRepository.count(); + return ruleMetricRepository.count(); } @Override public List findRuleMetrics(Integer level, List departmentList, - User user, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + User user, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); return ruleMetricRepository.findRuleMetrics(level, departmentList, user, pageable).getContent(); } @@ -85,38 +103,92 @@ public long countRuleMetrics(Integer level, List departmentList, Use return 
ruleMetricRepository.countRuleMetrics(level, departmentList, user); } + @Override + public List findNotUsed(Integer level, List departmentList, + User user, String createUser, String tableDataType, List dataVisibilityDeptList, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return ruleMetricRepository.findNotUsed(createUser, tableDataType, dataVisibilityDeptList, pageable).getContent(); + } + + @Override + public long countNotUsed(Integer level, List departmentList, User user, String createUser, String tableDataType, List dataVisibilityDeptList) { + return ruleMetricRepository.countNotUsed(createUser, tableDataType, dataVisibilityDeptList); + } + + @Override + public List findAllNotUsed(int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return ruleMetricRepository.findAllNotUsed(pageable).getContent(); + } + + @Override + public long countAllNotUsed() { + return ruleMetricRepository.countAllNotUsed(); + } + + @Override + public List findUsed(Integer level, List departmentList, + User user, String createUser, String tableDataType, List dataVisibilityDeptList, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return ruleMetricRepository.findUsed(createUser, tableDataType, dataVisibilityDeptList, pageable).getContent(); + } + + @Override + public long countUsed(Integer level, List departmentList, User user, String createUser, String tableDataType, List dataVisibilityDeptList) { + return ruleMetricRepository.countUsed(createUser, tableDataType, dataVisibilityDeptList); + } + + @Override + public List findAllUsed(int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return ruleMetricRepository.findAllUsed(pageable).getContent(); + } + + @Override + public 
long countAllUsed() { + return ruleMetricRepository.countAllUsed(); + } + @Override public RuleMetric add(RuleMetric ruleMetric) { - return ruleMetricRepository.save(ruleMetric); + return ruleMetricRepository.save(ruleMetric); } @Override public RuleMetric modify(RuleMetric ruleMetric) { - return ruleMetricRepository.save(ruleMetric); + return ruleMetricRepository.save(ruleMetric); } @Override public void delete(RuleMetric ruleMetric) { - ruleMetricRepository.delete(ruleMetric); + ruleMetricRepository.delete(ruleMetric); } @Override public RuleMetric findById(long id) { - return ruleMetricRepository.findById(id).get(); + Optional optional = ruleMetricRepository.findById(id); + if (optional.isPresent()) { + return optional.get(); + } + return null; } @Override public RuleMetric findByEnCode(String enCode) { - return ruleMetricRepository.findByEnCode(enCode); + return ruleMetricRepository.findByEnCode(enCode); } @Override public List findByIds(List ids) { - return ruleMetricRepository.findAllById(ids); + return ruleMetricRepository.findAllById(ids); } @Override public RuleMetric findByName(String name) { - return ruleMetricRepository.findByName(name); + return ruleMetricRepository.findByName(name); } } diff --git a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/RuleMetricRepository.java b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/RuleMetricRepository.java index e30d2c78..d1e40a34 100644 --- a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/RuleMetricRepository.java +++ b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/dao/repository/RuleMetricRepository.java @@ -19,12 +19,14 @@ import com.webank.wedatasphere.qualitis.entity.Department; import com.webank.wedatasphere.qualitis.entity.RuleMetric; import com.webank.wedatasphere.qualitis.entity.User; -import java.util.List; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import 
org.springframework.data.jpa.repository.JpaRepository; import org.springframework.data.jpa.repository.Query; +import java.util.List; +import java.util.Set; + /** * @author allenzhou */ @@ -36,54 +38,110 @@ public interface RuleMetricRepository extends JpaRepository { * @param enCode * @param type * @param available + * @param multiEnvs + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param tableDataType + * @param createUser + * @param modifyUser * @param pageable * @return */ - @Query(value = "SELECT qrm FROM RuleMetric qrm where (LENGTH(?1) = 0 OR qrm.subSystemName = ?1) AND (LENGTH(?2) = 0 OR qrm.name LIKE ?2) AND (LENGTH(?3) = 0 or qrm.enCode = ?3) AND (?4 is null or qrm.type = ?4) AND (?5 is null or qrm.available = ?5)") - Page queryAll(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Pageable pageable); + @Query(value = "SELECT qrm FROM RuleMetric qrm where (LENGTH(?1) = 0 OR qrm.subSystemName = ?1) AND (LENGTH(?2) = 0 OR qrm.name LIKE ?2) AND (LENGTH(?3) = 0 or qrm.enCode = ?3) " + + "AND (?4 is null or qrm.type = ?4) AND (?5 is null or qrm.available = ?5) AND (?6 is null or qrm.multiEnv = ?6) " + + "AND (LENGTH(?7) = 0 or qrm.devDepartmentId = ?7) " + + "AND (LENGTH(?8) = 0 or qrm.opsDepartmentId = ?8) " + + "AND (LENGTH(?11) = 0 or qrm.createUser = ?11) " + + "AND (LENGTH(?12) = 0 or qrm.modifyUser = ?12) " + + "AND (?9 is null or EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?10 AND (dv.departmentSubId in (?9) or dv.departmentSubId = 0) AND qrm = dv.tableDataId)) ") + Page queryAll(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser, Pageable pageable); /** * Count query rule metrics with SYS_ADMIN.
+ * * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param tableDataType + * @param createUser + * @param modifyUser * @return */ - @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where (LENGTH(?1) = 0 OR qrm.subSystemName = ?1) AND (LENGTH(?2) = 0 OR qrm.name LIKE ?2) AND (LENGTH(?3) = 0 or qrm.enCode = ?3) AND (?4 is null or qrm.type = ?4) AND (?5 is null or qrm.available = ?5)") - long countQueryAll(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available); + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where (LENGTH(?1) = 0 OR qrm.subSystemName = ?1) AND (LENGTH(?2) = 0 OR qrm.name LIKE ?2) AND (LENGTH(?3) = 0 or qrm.enCode = ?3) " + + "AND (?4 is null or qrm.type = ?4) AND (?5 is null or qrm.available = ?5) AND (?6 is null or qrm.multiEnv = ?6) " + + "AND (LENGTH(?7) = 0 or qrm.devDepartmentId = ?7) " + + "AND (LENGTH(?8) = 0 or qrm.opsDepartmentId = ?8) " + + "AND (LENGTH(?11) = 0 or qrm.createUser = ?11) " + + "AND (LENGTH(?12) = 0 or qrm.modifyUser = ?12) " + + "AND (?9 is null or EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?10 AND (dv.departmentSubId in (?9) or dv.departmentSubId = 0) AND qrm = dv.tableDataId)) ") + long countQueryAll(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs, String devDepartmentId, String opsDepartmentId, Set actionRange, String tableDataType, String createUser, String modifyUser); /** * Query pageable rule metrics with different characters(DEPARTMENT_ADMIN, PROJECTOR).
- * @param level - * @param departmentList - * @param user * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param tableDataType + * @param dataVisibilityDeptList + * @param createUser + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param buildUser + * @param modifyUser * @param pageable * @return */ - @Query(value = "SELECT qrm FROM RuleMetric qrm where (qrm.level = ?1 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR ?3 is null OR qrmdu.user = ?3))) AND (?4 = '' OR qrm.subSystemName = ?4) AND (?5 = '' OR qrm.name LIKE ?5) AND (?6 = '' OR qrm.enCode = ?6) AND (?7 is null OR qrm.type = ?7) AND (?8 is null or qrm.available = ?8)") - Page queryRuleMetrics(Integer level, List departmentList, User user, String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Pageable pageable); + @Query(value = "SELECT qrm FROM RuleMetric qrm where " + + " ((?1 = '' OR qrm.subSystemName = ?1) AND (?2 = '' OR qrm.name LIKE ?2) AND (?3 = '' OR qrm.enCode = ?3) AND (?4 is null OR qrm.type = ?4) AND (?5 is null or qrm.available = ?5) AND (?6 is null or qrm.multiEnv = ?6))" + + " AND (?10 = '' or qrm.devDepartmentId = ?10) " + + " AND (?11 = '' or qrm.opsDepartmentId = ?11) " + + " AND (?13 = '' or qrm.createUser = ?13) " + + " AND (?14 = '' or qrm.modifyUser = ?14) " + + " AND (?12 is null or EXISTS (SELECT q.tableDataId FROM DataVisibility q WHERE q.tableDataType = ?7 AND (q.departmentSubId in (?12) or q.departmentSubId = 0) AND qrm = q.tableDataId)) " + + " AND (qrm.createUser = ?9 OR qrm.devDepartmentId in (?8) OR qrm.opsDepartmentId in (?8) OR EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?7 AND (dv.departmentSubId in (?8) or dv.departmentSubId = 0) AND qrm = dv.tableDataId))" + ) + Page queryRuleMetrics(String subSystemName, String ruleMetricName, + String 
enCode, Integer type, Boolean available, Boolean multiEnvs, String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange,String buildUser,String modifyUser, Pageable pageable); /** * Count query rule metrics with different characters(DEPARTMENT_ADMIN, PROJECTOR). - * @param level - * @param departmentList - * @param user + * * @param subSystemName * @param ruleMetricName * @param enCode * @param type * @param available + * @param multiEnvs + * @param tableDataType + * @param dataVisibilityDeptList + * @param createUser + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param buildUser + * @param modifyUser * @return */ - @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where (qrm.level = ?1 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR ?3 is null OR qrmdu.user = ?3))) AND (?4 = '' OR qrm.subSystemName = ?4) AND (?5 = '' OR qrm.name LIKE ?5) AND (?6 = '' OR qrm.enCode = ?6) AND (?7 is null OR qrm.type = ?7) AND (?8 is null or qrm.available = ?8)") - long countQueryRuleMetrics(Integer level, List departmentList, User user, String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available); + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where " + + " ((?1 = '' OR qrm.subSystemName = ?1) AND (?2 = '' OR qrm.name LIKE ?2) AND (?3 = '' OR qrm.enCode = ?3) AND (?4 is null OR qrm.type = ?4) AND (?5 is null or qrm.available = ?5) AND (?6 is null or qrm.multiEnv = ?6))" + + " AND (?10 = '' or qrm.devDepartmentId = ?10) " + + " AND (?11 = '' or qrm.opsDepartmentId = ?11) " + + " AND (?13 = '' or qrm.createUser = ?13) " + + " AND (?14 = '' or qrm.modifyUser = ?14) " + + " AND (?12 is null or EXISTS (SELECT q.tableDataId FROM DataVisibility q WHERE q.tableDataType = ?7 AND (q.departmentSubId in (?12) or q.departmentSubId = 0) AND qrm = q.tableDataId)) " + + " AND 
(qrm.createUser = ?9 OR qrm.devDepartmentId in (?8) OR qrm.opsDepartmentId in (?8) OR EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?7 AND (dv.departmentSubId in (?8) or dv.departmentSubId = 0) AND qrm = dv.tableDataId))") + long countQueryRuleMetrics(String subSystemName, String ruleMetricName, String enCode, Integer type, Boolean available, Boolean multiEnvs, String tableDataType, List dataVisibilityDeptList, String createUser, String devDepartmentId, String opsDepartmentId, Set actionRange,String buildUser,String modifyUser); /** * Find pageable rule metrics with different characters(DEPARTMENT_ADMIN, PROJECTOR). @@ -93,7 +151,7 @@ public interface RuleMetricRepository extends JpaRepository { * @param pageable * @return */ - @Query(value = "SELECT qrm FROM RuleMetric qrm where qrm.level = ?1 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR qrmdu.user = ?3))") + @Query(value = "SELECT qrm FROM RuleMetric qrm where (?1 is null or qrm.level = ?1) OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR qrmdu.user = ?3))") Page findRuleMetrics(Integer level, List departmentList, User user, Pageable pageable); /** @@ -129,6 +187,29 @@ public interface RuleMetricRepository extends JpaRepository { @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where qrm.level = ?1 AND qrm.name LIKE ?4 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR qrmdu.user IN (?3)))") long countWithRuleMetricName(Integer level, List departmentList, User user, String name); + /** + * Query rule metrics by sub-system-id. 
+ * @param level + * @param departmentList + * @param user + * @param subSystemId + * @param pageable + * @return + */ + @Query(value = "SELECT qrm FROM RuleMetric qrm where qrm.level = ?1 AND qrm.subSystemId = ?4 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR qrmdu.user IN (?3)))") + Page findBySubSystemId(Integer level, List departmentList, User user, long subSystemId, Pageable pageable); + + /** + * Count of querying rule metrics by sub-system-id. + * @param level + * @param departmentList + * @param user + * @param subSystemId + * @return + */ + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where qrm.level = ?1 AND qrm.subSystemId = ?4 OR (qrm IN (SELECT qrmdu.ruleMetric FROM RuleMetricDepartmentUser qrmdu where qrmdu.department in (?2) OR qrmdu.user IN (?3)))") + long countBySubSystemId(Integer level, List departmentList, User user, long subSystemId); + /** * Find by name. * @param name @@ -143,4 +224,80 @@ public interface RuleMetricRepository extends JpaRepository { */ @Query(value = "SELECT qrm FROM RuleMetric qrm where qrm.enCode = ?1") RuleMetric findByEnCode(String enCode); + + /** + * Find not used rule metric + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @param pageable + * @return + */ + @Query(value = "SELECT qrm FROM RuleMetric qrm where qrm.available = 1 AND ((qrm.createUser = ?1 OR qrm.devDepartmentId in (?3) OR qrm.opsDepartmentId in (?3) OR EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?2 AND dv.departmentSubId in (?3) AND qrm = dv.tableDataId)) " + + " AND NOT EXISTS (SELECT id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + Page findNotUsed(String createUser, String tableDataType, List dataVisibilityDeptList, Pageable pageable); + + /** + * Count not used rule metric + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @return + */ + @Query(value = "SELECT count(qrm.id)
FROM RuleMetric qrm where qrm.available = 1 AND ((qrm.createUser = ?1 OR qrm.devDepartmentId in (?3) OR qrm.opsDepartmentId in (?3) OR EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?2 AND dv.departmentSubId in (?3) AND qrm = dv.tableDataId)) " + + " AND NOT EXISTS (SELECT id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + long countNotUsed(String createUser, String tableDataType, List dataVisibilityDeptList); + + /** + * Find not used rule metric + * @param pageable + * @return + */ + @Query(value = "SELECT qrm FROM RuleMetric qrm WHERE qrm.available = 1 AND (NOT EXISTS (SELECT qrac.id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + Page findAllNotUsed(Pageable pageable); + + /** + * Count not used rule metric + * @return + */ + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where qrm.available = 1 AND (NOT EXISTS (SELECT qrac.id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + long countAllNotUsed(); + + /** + * Find used rule metric + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @param pageable + * @return + */ + @Query(value = "SELECT qrm FROM RuleMetric qrm where qrm.available = 1 AND (qrm.createUser = ?1 OR qrm.devDepartmentId in (?3) OR qrm.opsDepartmentId in (?3) OR EXISTS (SELECT dv.tableDataId FROM DataVisibility dv WHERE dv.tableDataType = ?2 AND dv.departmentSubId in (?3) AND qrm = dv.tableDataId))" + + " AND EXISTS (SELECT id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id)") + Page findUsed(String createUser, String tableDataType, List dataVisibilityDeptList, Pageable pageable); + + /** + * Count used rule metric + * @param createUser + * @param tableDataType + * @param dataVisibilityDeptList + * @return + */ + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where qrm.available = 1 AND (qrm.createUser = ?1 OR qrm.devDepartmentId in (?3) OR qrm.opsDepartmentId in (?3) OR EXISTS (SELECT dv.tableDataId FROM 
DataVisibility dv WHERE dv.tableDataType = ?2 AND dv.departmentSubId in (?3) AND qrm = dv.tableDataId))" + + " AND EXISTS (SELECT id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id)") + long countUsed(String createUser, String tableDataType, List dataVisibilityDeptList); + + /** + * Find used rule metric + * @param pageable + * @return + */ + @Query(value = "SELECT qrm FROM RuleMetric qrm WHERE qrm.available = 1 AND (EXISTS (SELECT qrac.id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + Page findAllUsed(Pageable pageable); + + /** + * Count used rule metric + * @return + */ + @Query(value = "SELECT count(qrm.id) FROM RuleMetric qrm where qrm.available = 1 AND (EXISTS (SELECT qrac.id FROM AlarmConfig qrac where qrm.id = qrac.ruleMetric.id))") + long countAllUsed(); } diff --git a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetric.java b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetric.java index 5a48ecc9..f0b1713c 100644 --- a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetric.java +++ b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetric.java @@ -22,39 +22,46 @@ public class RuleMetric { private String name; @Column(name = "cn_name") private String cnName; - @Column(name = "metric_desc") private String metricDesc; + @Column(name = "sub_system_id") + private Integer subSystemId; @Column(name = "sub_system_name") private String subSystemName; @Column(name = "full_cn_name") private String fullCnName; + @Column(name = "product_id") + private String productId; @Column(name = "product_name") private String productName; + @Column(name = "create_user", length = 50) + private String createUser; + @Column(name = "create_time", length = 25) + private String createTime; + @Column(name = "modify_user", length = 50) + private String modifyUser; + @Column(name = "modify_time", length = 25) + private String modifyTime; + + @Column(name = "department_code") 
+ private String departmentCode; @Column(name = "department_name") private String departmentName; - - @Column(name = "dev_department_name") private String devDepartmentName; @Column(name = "ops_department_name") private String opsDepartmentName; + @Column(name = "dev_department_id") + private Long devDepartmentId; + @Column(name = "ops_department_id") + private Long opsDepartmentId; @Column(name = "metric_level") private Integer level; - @Column(name = "create_user", length = 50) - private String createUser; - @Column(name = "create_time", length = 25) - private String createTime; - @Column(name = "modify_user", length = 50) - private String modifyUser; - @Column(name = "modify_time", length = 25) - private String modifyTime; - @Column(name = "type") private Integer type; @Column(name = "en_code") @@ -70,22 +77,28 @@ public class RuleMetric { @Column(name = "buss_custom") private String bussCustom; - public RuleMetric(String name, String cnName, String desc, String subSystemName, String fullCnName, String productName, String departmentName - , String devDepartmentName, String opsDepartmentName, Integer type, String enCode, Integer frequency, Boolean available, Integer bussCode - , String bussCustom) { + @Column(name = "multi_env") + private Boolean multiEnv; + + public RuleMetric(String name, String cnName, String desc, Integer subSystemId, String subSystemName, String fullCnName, String productId + , String productName, String departmentCode, String departmentName, String devDepartmentName, String opsDepartmentName, Integer type + , String enCode, Integer frequency, Boolean available, Integer bussCode, String bussCustom, Boolean multiEnv) { this.name = name; this.cnName = cnName; this.metricDesc = desc; this.bussCode = bussCode; if (RuleMetricBussCodeEnum.SUBSYSTEM.getCode().equals(bussCode)) { + this.subSystemId = subSystemId; this.subSystemName = subSystemName; this.fullCnName = fullCnName; } else if (RuleMetricBussCodeEnum.PRODUCT.getCode().equals(bussCode)) { 
+ this.productId = productId; this.productName = productName; } else if (RuleMetricBussCodeEnum.CUSTOM.getCode().equals(bussCode)) { this.bussCustom = bussCustom; } + this.departmentCode = departmentCode; this.departmentName = departmentName; this.devDepartmentName = devDepartmentName; this.opsDepartmentName = opsDepartmentName; @@ -94,6 +107,7 @@ public RuleMetric(String name, String cnName, String desc, String subSystemName, this.enCode = enCode; this.frequency = frequency; this.available = available; + this.multiEnv = multiEnv; } public RuleMetric() { @@ -108,6 +122,22 @@ public void setId(Long id) { this.id = id; } + public Long getDevDepartmentId() { + return devDepartmentId; + } + + public void setDevDepartmentId(Long devDepartmentId) { + this.devDepartmentId = devDepartmentId; + } + + public Long getOpsDepartmentId() { + return opsDepartmentId; + } + + public void setOpsDepartmentId(Long opsDepartmentId) { + this.opsDepartmentId = opsDepartmentId; + } + public String getName() { return name; } @@ -132,6 +162,13 @@ public void setMetricDesc(String metricDesc) { this.metricDesc = metricDesc; } + public Integer getSubSystemId() { + return subSystemId; + } + + public void setSubSystemId(Integer subSystemId) { + this.subSystemId = subSystemId; + } public String getSubSystemName() { return subSystemName; @@ -149,6 +186,14 @@ public void setFullCnName(String fullCnName) { this.fullCnName = fullCnName; } + public String getProductId() { + return productId; + } + + public void setProductId(String productId) { + this.productId = productId; + } + public String getProductName() { return productName; } @@ -157,6 +202,14 @@ public void setProductName(String productName) { this.productName = productName; } + public String getDepartmentCode() { + return departmentCode; + } + + public void setDepartmentCode(String departmentCode) { + this.departmentCode = departmentCode; + } + public String getDepartmentName() { return departmentName; } @@ -269,6 +322,14 @@ public void 
setBussCustom(String bussCustom) { this.bussCustom = bussCustom; } + public Boolean getMultiEnv() { + return multiEnv; + } + + public void setMultiEnv(Boolean multiEnv) { + this.multiEnv = multiEnv; + } + @Override public String toString() { return "RuleMetric{" + @@ -276,6 +337,7 @@ public String toString() { ", name='" + name + '\'' + ", cnName='" + cnName + '\'' + ", metricDesc='" + metricDesc + '\'' + + ", departmentCode='" + departmentCode + '\'' + ", departmentName='" + departmentName + '\'' + ", devDepartmentName='" + devDepartmentName + '\'' + ", opsDepartmentName='" + opsDepartmentName + '\'' + diff --git a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetricDepartmentUser.java b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetricDepartmentUser.java index d1c9b202..fa7b8e2e 100644 --- a/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetricDepartmentUser.java +++ b/core/metric/src/main/java/com/webank/wedatasphere/qualitis/entity/RuleMetricDepartmentUser.java @@ -1,13 +1,6 @@ package com.webank.wedatasphere.qualitis.entity; -import javax.persistence.Column; -import javax.persistence.Entity; -import javax.persistence.GeneratedValue; -import javax.persistence.GenerationType; -import javax.persistence.Id; -import javax.persistence.ManyToOne; -import javax.persistence.OneToOne; -import javax.persistence.Table; +import javax.persistence.*; /** * @author allenzhou diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/config/ThreadPoolConfig.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/config/ThreadPoolConfig.java index c3142079..6ffb3b48 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/config/ThreadPoolConfig.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/config/ThreadPoolConfig.java @@ -26,12 +26,26 @@ public class ThreadPoolConfig { @Value("${timer.thread.size}") private Integer size; + 
@Value("${timer.thread.async_execution_core_size}") + private Integer executionCorePoolSize; + @Value("${timer.thread.async_execution_max_size}") + private Integer executionMaxPoolSize; + @Value("${timer.check.update_core_size}") + private Integer updateCorePoolSize; + @Value("${timer.check.update_max_size}") + private Integer updateMaxPoolSize; @Value("${timer.check.period}") private Integer period; + @Value("${timer.check.pending_period}") + private Integer pendingPeriod; @Value("${timer.check.update_job_size}") private Integer updateJobSize; @Value("${timer.lock.zk.path}") private String lockZkPath; + @Value("${timer.abnormal_data_record_alarm.cron}") + private String abnormalDataRecordAlarmCron; + @Value("${timer.abnormal_data_record_alarm.cron_enable}") + private Boolean abnormalDataRecordAlarmCronEnable; public ThreadPoolConfig() { // Default Constructor @@ -45,6 +59,38 @@ public void setSize(Integer size) { this.size = size; } + public Integer getExecutionCorePoolSize() { + return executionCorePoolSize; + } + + public void setExecutionCorePoolSize(Integer executionCorePoolSize) { + this.executionCorePoolSize = executionCorePoolSize; + } + + public Integer getUpdateCorePoolSize() { + return updateCorePoolSize; + } + + public void setUpdateCorePoolSize(Integer updateCorePoolSize) { + this.updateCorePoolSize = updateCorePoolSize; + } + + public Integer getUpdateMaxPoolSize() { + return updateMaxPoolSize; + } + + public void setUpdateMaxPoolSize(Integer updateMaxPoolSize) { + this.updateMaxPoolSize = updateMaxPoolSize; + } + + public Integer getExecutionMaxPoolSize() { + return executionMaxPoolSize; + } + + public void setExecutionMaxPoolSize(Integer executionMaxPoolSize) { + this.executionMaxPoolSize = executionMaxPoolSize; + } + public Integer getPeriod() { return period; } @@ -53,6 +99,14 @@ public void setPeriod(Integer period) { this.period = period; } + public Integer getPendingPeriod() { + return pendingPeriod; + } + + public void setPendingPeriod(Integer
pendingPeriod) { + this.pendingPeriod = pendingPeriod; + } + public String getLockZkPath() { return lockZkPath; } @@ -68,4 +122,20 @@ public Integer getUpdateJobSize() { return updateJobSize; } public void setUpdateJobSize(Integer updateJobSize) { this.updateJobSize = updateJobSize; } + + public String getAbnormalDataRecordAlarmCron() { + return abnormalDataRecordAlarmCron; + } + + public void setAbnormalDataRecordAlarmCron(String abnormalDataRecordAlarmCron) { + this.abnormalDataRecordAlarmCron = abnormalDataRecordAlarmCron; + } + + public Boolean getAbnormalDataRecordAlarmCronEnable() { + return abnormalDataRecordAlarmCronEnable; + } + + public void setAbnormalDataRecordAlarmCronEnable(Boolean abnormalDataRecordAlarmCronEnable) { + this.abnormalDataRecordAlarmCronEnable = abnormalDataRecordAlarmCronEnable; + } } diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/ha/HaAbstractServiceCoordinator.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/ha/HaAbstractServiceCoordinator.java index 881bf27a..6c3dd239 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/ha/HaAbstractServiceCoordinator.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/ha/HaAbstractServiceCoordinator.java @@ -59,9 +59,6 @@ public void release() { lockFlag = false; try { lock.release(); - } catch (IllegalMonitorStateException e) { - LOGGER.error("Failed to release lock of zookeeper."); - LOGGER.error(e.getMessage(), e); } catch (Exception e) { LOGGER.error("Failed to release lock of zookeeper."); LOGGER.error(e.getMessage(), e); diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/CheckerRunnable.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/CheckerRunnable.java index 1f5ee8ac..3f813638 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/CheckerRunnable.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/CheckerRunnable.java @@ -20,16
+20,14 @@ import com.webank.wedatasphere.qualitis.constant.ApplicationCommentEnum; import com.webank.wedatasphere.qualitis.constant.ApplicationStatusEnum; import com.webank.wedatasphere.qualitis.constant.TaskStatusEnum; - +import com.webank.wedatasphere.qualitis.dao.ApplicationCommentDao; import com.webank.wedatasphere.qualitis.dao.ApplicationDao; import com.webank.wedatasphere.qualitis.dao.TaskDao; import com.webank.wedatasphere.qualitis.entity.Application; +import com.webank.wedatasphere.qualitis.entity.ApplicationComment; import com.webank.wedatasphere.qualitis.entity.Task; -import com.webank.wedatasphere.qualitis.ha.AbstractServiceCoordinator; -import java.util.concurrent.ArrayBlockingQueue; -import java.util.concurrent.CountDownLatch; -import java.util.concurrent.ThreadPoolExecutor; -import java.util.concurrent.TimeUnit; +import com.webank.wedatasphere.qualitis.util.SpringContextHolder; +import org.apache.commons.collections.CollectionUtils; import org.apache.commons.lang.StringUtils; import org.slf4j.Logger; import org.slf4j.LoggerFactory; @@ -37,17 +35,22 @@ import java.util.ArrayList; import java.util.Arrays; import java.util.List; +import java.util.concurrent.ArrayBlockingQueue; +import java.util.concurrent.CountDownLatch; +import java.util.concurrent.ThreadPoolExecutor; +import java.util.concurrent.TimeUnit; +import java.util.stream.Collectors; /** * @author howeye */ public class CheckerRunnable implements Runnable { + private String ip; private TaskDao taskDao; private int updateJobSize; private IChecker iChecker; private ApplicationDao applicationDao; private static final ThreadPoolExecutor POOL; - private AbstractServiceCoordinator abstractServiceCoordinator; private static final Logger LOGGER = LoggerFactory.getLogger("monitor"); @@ -61,42 +64,39 @@ public class CheckerRunnable implements Runnable { new ThreadPoolExecutor.DiscardPolicy()); } - public CheckerRunnable(ApplicationDao applicationDao, TaskDao taskDao, IChecker iChecker, 
AbstractServiceCoordinator abstractServiceCoordinator, int updateSize) { + public CheckerRunnable(ApplicationDao applicationDao, TaskDao taskDao, IChecker iChecker, int updateSize, String ip) { this.applicationDao = applicationDao; + this.ip = ip; this.taskDao = taskDao; this.iChecker = iChecker; this.updateJobSize = updateSize; - this.abstractServiceCoordinator = abstractServiceCoordinator; - - abstractServiceCoordinator.init(); } @Override public void run() { try { LOGGER.info("Start to monitor application."); - abstractServiceCoordinator.coordinate(); // Get task that is not finished - List jobs = null; + List jobs; try { jobs = getJobs(); LOGGER.info("Succeed to find applications that are not end. Application: {}", jobs); } catch (Exception e) { - LOGGER.error("Failed to find applications that are not end.", e); + LOGGER.error("Failed to find applications that are not end."); + LOGGER.error(e.getMessage(), e); return; } int total = jobs.size(); int updateThreadSize = total / updateJobSize + 1; CountDownLatch latch = new CountDownLatch(updateThreadSize); - for (int indexThread = 0; total > 0 && indexThread < total;) { + for (int indexThread = 0; total > 0 && indexThread < total; indexThread += updateJobSize) { if (indexThread + updateJobSize < total) { POOL.execute(new UpdaterRunnable(iChecker, jobs.subList(indexThread, indexThread + updateJobSize), latch)); } else { POOL.execute(new UpdaterRunnable(iChecker, jobs.subList(indexThread, total), latch)); } - indexThread += updateJobSize; updateThreadSize --; } if (total > 0 && updateThreadSize == 0) { @@ -105,21 +105,17 @@ public void run() { LOGGER.info("Finish to monitor application."); } catch (Exception e) { LOGGER.error("Failed to monitor application, caused by: {}", e.getMessage(), e); - } finally { - abstractServiceCoordinator.release(); } } - - private static final List END_APPLICATION_STATUS_LIST = Arrays.asList(ApplicationStatusEnum.FINISHED.getCode(), - ApplicationStatusEnum.FAILED.getCode(), - 
ApplicationStatusEnum.NOT_PASS.getCode(), - ApplicationStatusEnum.ARGUMENT_NOT_CORRECT.getCode(), - ApplicationStatusEnum.TASK_SUBMIT_FAILED.getCode()); + private static final List NOT_END_APPLICATION_STATUS_LIST = Arrays.asList(ApplicationStatusEnum.SUBMITTED.getCode(), + ApplicationStatusEnum.RUNNING.getCode(), + ApplicationStatusEnum.SUCCESSFUL_CREATE_APPLICATION.getCode()); private static final List NOT_END_TASK_STATUS_LIST = Arrays.asList(TaskStatusEnum.SUBMITTED.getCode(), TaskStatusEnum.INITED.getCode(), TaskStatusEnum.RUNNING.getCode(), TaskStatusEnum.SCHEDULED.getCode()); private List getJobs() { - List notEndApplications = applicationDao.findByStatusNotIn(END_APPLICATION_STATUS_LIST); + List notEndApplications = applicationDao.findByStatusIn(NOT_END_APPLICATION_STATUS_LIST); + notEndApplications = notEndApplications.stream().filter(application -> ip.equals(application.getIp())).collect(Collectors.toList()); List jobCheckers = new ArrayList<>(); for (Application app : notEndApplications) { // Find not end task @@ -132,7 +128,12 @@ private List getJobs() { if (notEndTasks.isEmpty()) { LOGGER.info("Find abnormal application, which tasks is all end, but application is not end."); List allTasks = taskDao.findByApplication(app); - + if (CollectionUtils.isEmpty(allTasks)) { + LOGGER.info("Find abnormal application, which has no tasks, finish it with failed status."); + app.setStatus(ApplicationStatusEnum.FAILED.getCode()); + applicationDao.saveApplication(app); + continue; + } app.resetTask(); applicationDao.saveApplication(app); LOGGER.info("Finish to reset application status num."); @@ -142,16 +143,16 @@ private List getJobs() { for (Task task : allTasks) { if (task.getStatus().equals(TaskStatusEnum.FAILED.getCode())) { iChecker.checkIfLastJob(app, false, false, false); - } else if (task.getAbortOnFailure() != null && !task.getAbortOnFailure() && task.getStatus() - .equals(TaskStatusEnum.FAIL_CHECKOUT.getCode())) { + } else if 
(task.getStatus().equals(TaskStatusEnum.CANCELLED.getCode())) { + ApplicationComment applicationComment = SpringContextHolder.getBean(ApplicationCommentDao.class).getByCode(ApplicationCommentEnum.TIMEOUT_KILL.getCode()); + app.setApplicationComment(applicationComment != null ? applicationComment.getCode() : null); + iChecker.checkIfLastJob(app, false, false, false); + } else if (task.getStatus().equals(TaskStatusEnum.FAIL_CHECKOUT.getCode())) { iChecker.checkIfLastJob(app, true, false, false); } else if (task.getStatus().equals(TaskStatusEnum.PASS_CHECKOUT.getCode())) { iChecker.checkIfLastJob(app, true, true, false); } else if (task.getStatus().equals(TaskStatusEnum.TASK_NOT_EXIST.getCode())) { iChecker.checkIfLastJob(app, false, false, true); - } else if (task.getStatus().equals(TaskStatusEnum.CANCELLED.getCode())) { - app.setApplicationComment(ApplicationCommentEnum.TIMEOUT_KILL.getCode()); - iChecker.checkIfLastJob(app, false, false, false); } } LOGGER.info("Succeed to recover application status."); diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/IChecker.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/IChecker.java index 1facb11b..e0eb2b62 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/IChecker.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/IChecker.java @@ -40,4 +40,9 @@ public interface IChecker { * @param isNotExist */ void checkIfLastJob(Application applicationInDb, boolean finish, boolean isPass, boolean isNotExist); + + /** + * Abnormal data record alarm. 
+ */ + void abnormalDataRecordAlarm(); } diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/JobCheckerTimer.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/JobCheckerTimer.java index db0a1bf4..17ef3f92 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/JobCheckerTimer.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/JobCheckerTimer.java @@ -16,11 +16,14 @@ package com.webank.wedatasphere.qualitis.timer; +import com.webank.wedatasphere.qualitis.config.ImsConfig; import com.webank.wedatasphere.qualitis.config.ThreadPoolConfig; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; import com.webank.wedatasphere.qualitis.dao.ApplicationDao; import com.webank.wedatasphere.qualitis.dao.TaskDao; import com.webank.wedatasphere.qualitis.ha.AbstractServiceCoordinator; +import com.webank.wedatasphere.qualitis.util.AlarmUtil; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler; import org.springframework.scheduling.support.CronTrigger; @@ -42,6 +45,8 @@ public class JobCheckerTimer { @Autowired private ApplicationDao applicationDao; @Autowired + private ImsConfig imsConfig; + @Autowired private TaskDao taskDao; @Autowired private IChecker iChecker; @@ -52,8 +57,15 @@ public class JobCheckerTimer { public void init() { ScheduledExecutorService executor = new ScheduledThreadPoolExecutor(threadPoolConfig.getSize(), new MonitoryThreadFactory()); executor.scheduleWithFixedDelay( - new CheckerRunnable(applicationDao, taskDao, iChecker, abstractServiceCoordinator, threadPoolConfig.getUpdateJobSize()), + new CheckerRunnable(applicationDao, taskDao, iChecker, threadPoolConfig.getUpdateJobSize(), QualitisConstants.QUALITIS_SERVER_HOST), 0, threadPoolConfig.getPeriod(), TimeUnit.MILLISECONDS); + + if (threadPoolConfig.getAbnormalDataRedordAlarmCronEnable()) { + 
ThreadPoolTaskScheduler threadPoolTaskScheduler = new ThreadPoolTaskScheduler(); + threadPoolTaskScheduler.initialize(); + threadPoolTaskScheduler.schedule(new UploaderRunnable(iChecker, abstractServiceCoordinator) + , new CronTrigger(threadPoolConfig.getAbnormalDataRedordAlarmCron())); + } } } diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/TaskChecker.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/TaskChecker.java index 30edef5d..6f0b8308 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/TaskChecker.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/TaskChecker.java @@ -16,39 +16,67 @@ package com.webank.wedatasphere.qualitis.timer; +import com.google.common.collect.Lists; +import com.google.common.collect.Sets; import com.webank.wedatasphere.qualitis.bean.JobChecker; +import com.webank.wedatasphere.qualitis.client.AlarmClient; +import com.webank.wedatasphere.qualitis.config.ImsConfig; import com.webank.wedatasphere.qualitis.config.LinkisConfig; import com.webank.wedatasphere.qualitis.constant.AlarmConfigStatusEnum; import com.webank.wedatasphere.qualitis.constant.ApplicationCommentEnum; import com.webank.wedatasphere.qualitis.constant.ApplicationStatusEnum; +import com.webank.wedatasphere.qualitis.constant.ImsLevelEnum; import com.webank.wedatasphere.qualitis.constant.TaskStatusEnum; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; +import com.webank.wedatasphere.qualitis.dao.AbnormalDataRecordInfoDao; +import com.webank.wedatasphere.qualitis.dao.AlarmInfoDao; +import com.webank.wedatasphere.qualitis.dao.ApplicationCommentDao; import com.webank.wedatasphere.qualitis.dao.ApplicationDao; +import com.webank.wedatasphere.qualitis.dao.LinksErrorCodeDao; import com.webank.wedatasphere.qualitis.dao.RuleMetricDao; import com.webank.wedatasphere.qualitis.dao.TaskDao; import com.webank.wedatasphere.qualitis.dao.TaskDataSourceDao; import 
com.webank.wedatasphere.qualitis.dao.TaskResultDao; +import com.webank.wedatasphere.qualitis.dao.TaskResultStatusDao; import com.webank.wedatasphere.qualitis.dao.TaskRuleSimpleDao; +import com.webank.wedatasphere.qualitis.dao.UploadRecordDao; import com.webank.wedatasphere.qualitis.dao.UserDao; +import com.webank.wedatasphere.qualitis.entity.AbnormalDataRecordInfo; import com.webank.wedatasphere.qualitis.entity.Application; +import com.webank.wedatasphere.qualitis.entity.ApplicationComment; +import com.webank.wedatasphere.qualitis.entity.LinksErrorCode; +import com.webank.wedatasphere.qualitis.entity.ReportBatchInfo; +import com.webank.wedatasphere.qualitis.entity.RuleMetric; import com.webank.wedatasphere.qualitis.entity.Task; +import com.webank.wedatasphere.qualitis.entity.TaskDataSource; import com.webank.wedatasphere.qualitis.entity.TaskResult; +import com.webank.wedatasphere.qualitis.entity.TaskResultStatus; import com.webank.wedatasphere.qualitis.entity.TaskRuleAlarmConfig; import com.webank.wedatasphere.qualitis.entity.TaskRuleSimple; +import com.webank.wedatasphere.qualitis.entity.UploadRecord; import com.webank.wedatasphere.qualitis.exception.ClusterInfoNotConfigException; import com.webank.wedatasphere.qualitis.exception.JobKillException; import com.webank.wedatasphere.qualitis.exception.TaskNotExistException; import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; import com.webank.wedatasphere.qualitis.job.MonitorManager; +import com.webank.wedatasphere.qualitis.rule.constant.NoiseStrategyEnum; import com.webank.wedatasphere.qualitis.rule.constant.RuleTemplateTypeEnum; +import com.webank.wedatasphere.qualitis.rule.constant.TemplateDataSourceTypeEnum; +import com.webank.wedatasphere.qualitis.rule.dao.ExecutionParametersDao; import com.webank.wedatasphere.qualitis.rule.dao.RuleDao; import com.webank.wedatasphere.qualitis.rule.dao.RuleGroupDao; +import com.webank.wedatasphere.qualitis.rule.entity.AlarmArgumentsExecutionParameters; 
+import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters; +import com.webank.wedatasphere.qualitis.rule.entity.NoiseEliminationManagement; +import com.webank.wedatasphere.qualitis.rule.entity.Rule; +import com.webank.wedatasphere.qualitis.rule.response.CheckConditionsResponse; import com.webank.wedatasphere.qualitis.submitter.ExecutionManager; +import com.webank.wedatasphere.qualitis.util.AlarmUtil; import com.webank.wedatasphere.qualitis.util.PassUtil; -import java.util.Date; -import java.util.HashMap; -import java.util.List; -import java.util.Map; -import java.util.stream.Collectors; +import com.webank.wedatasphere.qualitis.util.ReportUtil; +import com.webank.wedatasphere.qualitis.util.map.CustomObjectMapper; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.lang3.StringUtils; import org.joda.time.DateTime; import org.joda.time.format.DateTimeFormat; import org.joda.time.format.DateTimeFormatter; @@ -56,9 +84,26 @@ import org.slf4j.LoggerFactory; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Component; +import org.springframework.transaction.annotation.Propagation; import org.springframework.transaction.annotation.Transactional; import org.springframework.web.client.ResourceAccessException; +import javax.annotation.PostConstruct; +import java.math.BigDecimal; +import java.net.InetAddress; +import java.net.UnknownHostException; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Calendar; +import java.util.Date; +import java.util.HashMap; +import java.util.HashSet; +import java.util.Iterator; +import java.util.List; +import java.util.Map; +import java.util.Set; +import java.util.stream.Collectors; + /** * @author howeye */ @@ -81,80 +126,72 @@ public class TaskChecker implements IChecker { @Autowired private RuleDao ruleDao; @Autowired + private ImsConfig imsConfig; + @Autowired private LinkisConfig linkisConfig; @Autowired + private 
UploadRecordDao uploadRecordDao; + @Autowired private ExecutionManager executionManager; @Autowired private TaskRuleSimpleDao taskRuleSimpleDao; @Autowired + private AbnormalDataRecordInfoDao abnormalDataRecordInfoDao; + @Autowired + private AlarmInfoDao alarmInfoDao; + @Autowired + private AlarmClient alarmClient; + @Autowired private UserDao userDao; + @Autowired + private TaskResultStatusDao taskResultStatusDao; + @Autowired + private ExecutionParametersDao executionParametersDao; + @Autowired + private LinksErrorCodeDao linksErrorCodeDao; + @Autowired + private ApplicationCommentDao applicationCommentDao; + private static final int BATCH_ABNORMAL_DATA_RECORD = 500; private static final String PRINT_TIME_PATTERN = "yyyy-MM-dd HH:mm:ss"; private static final Logger LOGGER = LoggerFactory.getLogger(TaskChecker.class); private static final DateTimeFormatter PRINT_TIME_FORMAT = DateTimeFormat.forPattern(PRINT_TIME_PATTERN); - private static final Map ERR_CODE_TYPE = new HashMap(){{ - put(60075,2); - put(10001,2); - put(20001,3); - put(20002,3); - put(20003,3); - put(20083,3); - put(70059,3); - put(11011,3); - put(11012,3); - put(11013,3); - put(11014,3); - put(11015,3); - put(11016,3); - put(11017,3); - put(60035,3); - put(21304,3); - put(30001,1); - put(60010,1); - put(40001,4); - put(40002,10); - put(40004,4); - put(40003,4); - put(40005,4); - put(50001,4); - put(50002,4); - put(50003,4); - put(50004,4); - put(50005,4); - put(50007,4); - put(50012,4); - put(50013,4); - put(50014,4); - put(50017,4); - put(50019,4); - put(60003,4); - put(11017,3); - put(60035,3); - put(60075,2); - put(95002,1); - put(95003,1); - put(95004,1); - put(95006,1); - put(60079,1); - put(30002,5); - put(50007,4); - }}; + private static final Map ERR_CODE_TYPE = new HashMap(); + + private static final List APPLICATION_COMMENT_LIST = Lists.newArrayList(); + + @PostConstruct + public void init() { + List allLinksErrorCode = linksErrorCodeDao.findAllLinksErrorCode(); + if 
(CollectionUtils.isNotEmpty(allLinksErrorCode)) { + for (LinksErrorCode linksErrorCode : allLinksErrorCode) { + ERR_CODE_TYPE.put(linksErrorCode.getLinkisErrorCode(), linksErrorCode.getApplicationComment()); + } + } + + List allApplicationComment = applicationCommentDao.findAllApplicationComment(); + if (CollectionUtils.isNotEmpty(allApplicationComment)) { + APPLICATION_COMMENT_LIST.addAll(allApplicationComment); + } + + } + @Override @Transactional(rollbackFor = Exception.class) public void checkTaskStatus(JobChecker jobChecker) { try { Map taskInfos = monitorManager.getTaskStatus(jobChecker.getTaskId(), jobChecker.getUsername(), - jobChecker.getUjesAddress(), jobChecker.getClusterName()); + jobChecker.getUjesAddress(), jobChecker.getClusterName()); String jobStatus = ((String) taskInfos.get("status")).toUpperCase(); Integer errCode = (Integer) taskInfos.get("errCode"); LOGGER.info("Task status: {}", jobStatus); - if (! jobStatus.equals(jobChecker.getOldStatus())) { - LOGGER.info("Start to update task status. old status: {}, new status: {}, task_id: {}", jobChecker.getOldStatus(), jobStatus, jobChecker.getTaskId()); + if (!jobStatus.equals(jobChecker.getOldStatus())) { + LOGGER.info("Start to update task status. old status: {}, new status: {}, task ID: {}", jobChecker.getOldStatus(), jobStatus, jobChecker.getTaskId()); writeDb(jobChecker, jobStatus, errCode); - LOGGER.info("Succeed to update task status. old status: {}, new status: {}, task_id: {}", jobChecker.getOldStatus(), jobStatus, jobChecker.getTaskId()); + LOGGER.info("Succeed to update task status. old status: {}, new status: {}, task ID: {}", jobChecker.getOldStatus(), jobStatus, jobChecker.getTaskId()); } // Compute task time in same progress. 
@@ -165,19 +202,21 @@ public void checkTaskStatus(JobChecker jobChecker) { LOGGER.info("Current time progress[{}].", progress); long runningTime = System.currentTimeMillis() - taskInDb.getRunningTime(); LOGGER.info("Current task running time [{}] minutes.", runningTime / (60 * 1000)); - if (progress.equals(jobChecker.getOldProgress())) { + // compareTo returns 0 when equal, a positive value when the former is greater, and a negative value when the former is smaller + if (BigDecimal.valueOf(progress).compareTo(BigDecimal.valueOf(jobChecker.getOldProgress())) == 0) { long diff = System.currentTimeMillis() - taskInDb.getNewProgressTime(); long diffMinutes = diff; LOGGER.info("Time in same progress[{}]: {} minutes. Config max time: {} minutes.", progress, diffMinutes / (60 * 1000) - , linkisConfig.getKillStuckTasksTime().longValue() / (60 * 1000)); + , linkisConfig.getKillStuckTasksTime().longValue() / (60 * 1000)); if (diffMinutes > linkisConfig.getKillStuckTasksTime().longValue()) { - killTimeoutTask(applicationDao.findById(jobChecker.getApplicationId()),taskInDb, jobChecker); + killTimeoutTask(applicationDao.findById(jobChecker.getApplicationId()), taskInDb, jobChecker); } } else { - LOGGER.info("Progress is updating , so is task new progress."); + LOGGER.info("Progress is updating, so is task new progress."); taskInDb.setNewProgressTime(System.currentTimeMillis()); taskInDb.setProgress(progress); - if (runningTime > linkisConfig.getKillStuckTasksTime().longValue()) { + if (runningTime > linkisConfig.getKillTotalTasksTime().longValue()) { + LOGGER.info("Running time exceeded the configured max total time: {} minutes.", linkisConfig.getKillTotalTasksTime().longValue() / (60 * 1000)); killTimeoutTask(applicationDao.findById(jobChecker.getApplicationId()), taskInDb, jobChecker); } } @@ -222,10 +261,10 @@ public void checkIfLastJob(Application applicationInDb, boolean finish, boolean applicationInDb.addNotPassTaskNum(); LOGGER.info("Application add not pass task, application: {}", applicationInDb); - } else if (!
isNotExist) { + } else if (!isNotExist) { applicationInDb.addFailJobNum(); LOGGER.info("Application add failed task, application: {}", applicationInDb); - } else if (isNotExist) { + } else { applicationInDb.addAbnormalTaskNum(); LOGGER.info("Application add abnormal task, application: {}", applicationInDb); } @@ -235,16 +274,20 @@ public void checkIfLastJob(Application applicationInDb, boolean finish, boolean private void writeDb(JobChecker jobChecker, String newStatus, Integer errCode) { Task taskInDb = taskDao.findByRemoteTaskIdAndClusterName(jobChecker.getTaskId(), jobChecker.getClusterName()); Application applicationInDb = applicationDao.findById(jobChecker.getApplicationId()); + if (newStatus.equals(TaskStatusEnum.FAILED.getState())) { /* * 1.Modify end time of job * 2.Modify task finish time and failed num if last job * */ taskInDb.setEndTime(new DateTime(new Date()).toString(PRINT_TIME_FORMAT)); - taskInDb.setTaskComment(errCode == null ? ApplicationCommentEnum.UNKNOWN_ERROR_ISSUES.getCode() : ERR_CODE_TYPE.get(errCode)); + List collect = APPLICATION_COMMENT_LIST.stream().filter(item -> item.getCode().toString().equals(ApplicationCommentEnum.UNKNOWN_ERROR_ISSUES.getCode().toString())).collect(Collectors.toList()); + Integer applicationCommentCode = CollectionUtils.isNotEmpty(collect) ? collect.get(0).getCode() : null; + + taskInDb.setTaskComment(errCode == null ? applicationCommentCode : ERR_CODE_TYPE.get(String.valueOf(errCode))); modifyJobStatus(taskInDb, newStatus); taskDao.save(taskInDb); - applicationInDb.setApplicationComment(errCode == null ? ApplicationCommentEnum.UNKNOWN_ERROR_ISSUES.getCode() : ERR_CODE_TYPE.get(errCode)); + applicationInDb.setApplicationComment(errCode == null ? 
applicationCommentCode : ERR_CODE_TYPE.get(String.valueOf(errCode))); checkIfLastJob(applicationInDb, false, false, false); } else if (newStatus.equals(TaskStatusEnum.SUCCEED.getState())) { /* @@ -259,10 +302,12 @@ private void writeDb(JobChecker jobChecker, String newStatus, Integer errCode) { isPass = true; finish = true; } else { - if (taskInDb.getAbortOnFailure() != null && taskInDb.getAbortOnFailure()) { + if (Boolean.FALSE.equals(checkWhetherBlocked(taskInDb)) && Boolean.TRUE.equals(taskInDb.getAbortOnFailure())) { modifyJobStatus(taskInDb, TaskStatusEnum.FAILED.getState()); - taskInDb.setTaskComment(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode()); - applicationInDb.setApplicationComment(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode()); + List collect = APPLICATION_COMMENT_LIST.stream().filter(item -> item.getCode().toString().equals(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode().toString())).collect(Collectors.toList()); + Integer applicationCommentCode = CollectionUtils.isNotEmpty(collect) ? collect.get(0).getCode() : null; + + applicationInDb.setApplicationComment(applicationCommentCode); finish = false; } else { modifyJobStatus(taskInDb, TaskStatusEnum.FAIL_CHECKOUT.getState()); @@ -275,7 +320,10 @@ private void writeDb(JobChecker jobChecker, String newStatus, Integer errCode) { } else if (newStatus.equals(TaskStatusEnum.CANCELLED.getState())) { modifyJobStatus(taskInDb, newStatus); taskDao.save(taskInDb); - applicationInDb.setApplicationComment(ApplicationCommentEnum.TIMEOUT_KILL.getCode()); + List collect = APPLICATION_COMMENT_LIST.stream().filter(item -> item.getCode().toString().equals(ApplicationCommentEnum.TIMEOUT_KILL.getCode().toString())).collect(Collectors.toList()); + Integer applicationCommentCode = CollectionUtils.isNotEmpty(collect) ? 
collect.get(0).getCode() : null; + + applicationInDb.setApplicationComment(applicationCommentCode); checkIfLastJob(applicationInDb, false, false, false); } else { modifyJobStatus(taskInDb, newStatus); @@ -309,17 +357,13 @@ private void modifyJobStatus(Task task, String newStatus) { } else if (newStatus.equals(TaskStatusEnum.PASS_CHECKOUT.getState())) { task.setStatus(TaskStatusEnum.PASS_CHECKOUT.getCode()); task.setProgress(Double.parseDouble("1")); - task.setTaskComment(ApplicationCommentEnum.SAME_ISSUES.getCode()); } else if (newStatus.equals(TaskStatusEnum.FAIL_CHECKOUT.getState())) { task.setStatus(TaskStatusEnum.FAIL_CHECKOUT.getCode()); task.setProgress(Double.parseDouble("1")); - task.setTaskComment(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode()); } else if (newStatus.equals(TaskStatusEnum.CANCELLED.getState())) { task.setStatus(TaskStatusEnum.CANCELLED.getCode()); - task.setTaskComment(ApplicationCommentEnum.TIMEOUT_KILL.getCode()); } else if (newStatus.equals(TaskStatusEnum.TIMEOUT.getState())) { task.setStatus(TaskStatusEnum.TIMEOUT.getCode()); - task.setTaskComment(ApplicationCommentEnum.TIMEOUT_KILL.getCode()); } else if (newStatus.equals(TaskStatusEnum.SCHEDULED.getState())) { task.setStatus(TaskStatusEnum.SCHEDULED.getCode()); } else { @@ -331,15 +375,9 @@ private void modifyJobStatus(Task task, String newStatus) { private Boolean passCheckOut(String applicationId, Task task) { Boolean passFlag = true; for (TaskRuleSimple taskRuleSimple : task.getTaskRuleSimples()) { - if (! checkTaskRuleSimplePass(applicationId, taskRuleSimple)) { + if (!checkTaskRuleSimplePass(applicationId, taskRuleSimple)) { passFlag = false; } - - if (taskRuleSimple.getChildRuleSimple() != null) { - if (! 
checkTaskRuleSimplePass(applicationId, taskRuleSimple.getChildRuleSimple())) { - passFlag = false; - } - } } return passFlag; @@ -349,37 +387,58 @@ private Boolean passCheckOut(String applicationId, Task task) { private Boolean checkTaskRuleSimplePass(String applicationId, TaskRuleSimple taskRuleSimple) { Boolean passFlag = true; List<TaskResult> taskResults = taskResultDao.findByApplicationAndRule(applicationId, taskRuleSimple.getRuleId()); + if (CollectionUtils.isEmpty(taskResults)) { + return false; + } + List<TaskResultStatus> taskResultStatusList = Lists.newArrayList(); for (TaskResult taskResult : taskResults) { List<TaskRuleAlarmConfig> taskRuleAlarmConfigList = taskRuleSimple.getTaskRuleAlarmConfigList(); Long ruleMetricId = taskResult.getRuleMetricId(); if (ruleMetricId != null && ruleMetricId.longValue() != -1) { taskRuleAlarmConfigList = taskRuleAlarmConfigList.stream().filter(taskRuleAlarmConfig -> - taskRuleAlarmConfig.getRuleMetric().getId().equals(ruleMetricId) + taskRuleAlarmConfig.getRuleMetric().getId().equals(ruleMetricId) ).collect(Collectors.toList()); } +// Iterate over alarm configs and check each expected value for (TaskRuleAlarmConfig taskRuleAlarmConfig : taskRuleAlarmConfigList) { - if (PassUtil.notSafe(applicationId, taskRuleSimple.getRuleId(), taskRuleAlarmConfig, taskResult, taskResultDao)) { - taskRuleAlarmConfig.setStatus(AlarmConfigStatusEnum.PASS.getCode()); + TaskResultStatus taskResultStatus = new TaskResultStatus(); + taskResultStatus.setApplicationId(applicationId); + taskResultStatus.setRuleId(taskRuleSimple.getRuleId()); + taskResultStatus.setTaskResult(taskResult); + taskResultStatus.setTaskRuleAlarmConfigId(taskRuleAlarmConfig.getId()); + taskResultStatusList.add(taskResultStatus); + if (AlarmConfigStatusEnum.NOT_PASS.getCode().equals(taskRuleAlarmConfig.getStatus())) { + taskResultStatus.setStatus(AlarmConfigStatusEnum.NOT_PASS.getCode()); + } else if (AlarmConfigStatusEnum.PASS.getCode().equals(taskRuleAlarmConfig.getStatus())) { + taskResultStatus.setStatus(AlarmConfigStatusEnum.PASS.getCode()); } else { - passFlag =
false; - taskRuleAlarmConfig.setStatus(AlarmConfigStatusEnum.NOT_PASS.getCode()); + Boolean passReal = PassUtil.notSafe(applicationId, taskRuleSimple.getRuleId(), taskRuleAlarmConfig, taskResult, taskResultDao); - if (taskRuleSimple.getRuleType().equals(RuleTemplateTypeEnum.CUSTOM.getCode())) { - if (taskRuleAlarmConfig.getDeleteFailCheckResult() != null && true == taskRuleAlarmConfig.getDeleteFailCheckResult().booleanValue()) { - taskResult.setSaveResult(false); - taskResultDao.saveTaskResult(taskResult); - } + if (passReal) { + taskRuleAlarmConfig.setStatus(AlarmConfigStatusEnum.PASS.getCode()); + taskResultStatus.setStatus(AlarmConfigStatusEnum.PASS.getCode()); } else { - if (taskRuleSimple.getDeleteFailCheckResult() != null && true == taskRuleSimple.getDeleteFailCheckResult().booleanValue()) { - taskResult.setSaveResult(false); - taskResultDao.saveTaskResult(taskResult); + taskResultStatus.setStatus(AlarmConfigStatusEnum.NOT_PASS.getCode()); + passFlag = false; + taskRuleAlarmConfig.setStatus(AlarmConfigStatusEnum.NOT_PASS.getCode()); + + if (taskRuleSimple.getRuleType().equals(RuleTemplateTypeEnum.CUSTOM.getCode()) + || taskRuleSimple.getRuleType().equals(RuleTemplateTypeEnum.FILE_COUSTOM.getCode())) { + if (taskRuleAlarmConfig.getDeleteFailCheckResult() != null && true == taskRuleAlarmConfig.getDeleteFailCheckResult().booleanValue()) { + taskResult.setSaveResult(false); + taskResultDao.saveTaskResult(taskResult); + } + } else { + if (taskRuleSimple.getDeleteFailCheckResult() != null && true == taskRuleSimple.getDeleteFailCheckResult().booleanValue()) { + taskResult.setSaveResult(false); + taskResultDao.saveTaskResult(taskResult); + } } } - } } } - + taskResultStatusDao.saveBatch(taskResultStatusList); return passFlag; } @@ -389,13 +448,20 @@ private void ifLastTaskAndSaveApplication(Application applicationInDb) { applicationInDb.setFinishTime(new DateTime(new Date()).toString(PRINT_TIME_FORMAT)); if 
(applicationInDb.getFinishTaskNum().equals(applicationInDb.getTotalTaskNum())) { applicationInDb.setStatus(ApplicationStatusEnum.FINISHED.getCode()); - applicationInDb.setApplicationComment(ApplicationCommentEnum.SAME_ISSUES.getCode()); - } else if (!applicationInDb.getFailTaskNum().equals(0) || !applicationInDb.getAbnormalTaskNum().equals(0)){ + List collect = APPLICATION_COMMENT_LIST.stream().filter(item -> item.getCode().toString().equals(ApplicationCommentEnum.SAME_ISSUES.getCode().toString())).collect(Collectors.toList()); + Integer applicationCommentCode = CollectionUtils.isNotEmpty(collect) ? collect.get(0).getCode() : null; + + applicationInDb.setApplicationComment(applicationCommentCode); + } else if (!applicationInDb.getFailTaskNum().equals(0) || !applicationInDb.getAbnormalTaskNum().equals(0)) { applicationInDb.setStatus(ApplicationStatusEnum.FAILED.getCode()); } else { applicationInDb.setStatus(ApplicationStatusEnum.NOT_PASS.getCode()); - applicationInDb.setApplicationComment(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode()); + List collect = APPLICATION_COMMENT_LIST.stream().filter(item -> item.getCode().toString().equals(ApplicationCommentEnum.DIFF_DATA_ISSUES.getCode().toString())).collect(Collectors.toList()); + Integer applicationCommentCode = CollectionUtils.isNotEmpty(collect) ? 
collect.get(0).getCode() : null; + applicationInDb.setApplicationComment(applicationCommentCode); } + checkIfSendAlarm(applicationInDb); + checkIfReport(applicationInDb, imsConfig); } if (applicationInDb.getTotalTaskNum() != null) { applicationDao.saveApplication(applicationInDb); @@ -405,10 +471,475 @@ private void ifLastTaskAndSaveApplication(Application applicationInDb) { private boolean isLastJob(Application application) { LOGGER.info("Calculate application num: application.getFinishTaskNum ADD application.getFailTaskNum ADD application.getNotPassTaskNum ADD application.getAbnormalTaskNum = {} ADD {} ADD {} ADD {} vs application.getTotalTaskNum == {}", - application.getFinishTaskNum(), application.getFailTaskNum(), application.getNotPassTaskNum(), application.getAbnormalTaskNum(), application.getTotalTaskNum()); + application.getFinishTaskNum(), application.getFailTaskNum(), application.getNotPassTaskNum(), application.getAbnormalTaskNum(), application.getTotalTaskNum()); if (application.getTotalTaskNum() == null) { return false; } return application.getFinishTaskNum() + application.getFailTaskNum() + application.getNotPassTaskNum() + application.getAbnormalTaskNum() == application.getTotalTaskNum(); } + + private void checkIfSendAlarm(Application application) { + LOGGER.info("Start to collect alarm info."); + List tasks = taskDao.findByApplication(application); + + List notPassTask = tasks.stream().filter(job -> job.getStatus().equals(TaskStatusEnum.FAIL_CHECKOUT.getCode())).collect(Collectors.toList()); + LOGGER.info("Succeed to collect failed pass tasks. Task ID: {}", notPassTask.stream().map(Task::getId).collect(Collectors.toList())); + List notPassTaskRuleSimples = AlarmUtil.notSafeTaskRuleSimple(notPassTask); + + List failedTask = tasks.stream().filter(job -> job.getStatus().equals(TaskStatusEnum.FAILED.getCode()) || job.getStatus().equals(TaskStatusEnum.CANCELLED.getCode())).collect(Collectors.toList()); + LOGGER.info("Succeed to collect failed tasks. 
Task ID: {}", failedTask.stream().map(Task::getId).collect(Collectors.toList()));
+        List<TaskRuleSimple> failedTaskRuleSimples = AlarmUtil.getFailedTaskRule(failedTask);
+        for (Iterator<TaskRuleSimple> taskRuleSimpleIterator = failedTaskRuleSimples.iterator(); taskRuleSimpleIterator.hasNext(); ) {
+            TaskRuleSimple taskRuleSimple = taskRuleSimpleIterator.next();
+            List<TaskRuleAlarmConfig> taskRuleAlarmConfigList = taskRuleSimple.getTaskRuleAlarmConfigList();
+            int count = (int) taskRuleAlarmConfigList.stream().filter(o -> !AlarmConfigStatusEnum.PASS.getCode().equals(o.getStatus())).count();
+            if (0 == count) {
+                taskRuleSimpleIterator.remove();
+            }
+        }
+
+        // Whether to alarm is decided by the configured alarm events (AlarmEventEnum): a. only pass; b. task failed, not pass + abort, not pass + not abort; c. pass, not pass + abort, not pass + not abort
+        // CHECK_SUCCESS: verification passed; CHECK_FAILURE: verification failed; EXECUTION_COMPLETED: execution finished
+        List<String> alreadyAlertApp = new ArrayList<>();
+        for (Task task : tasks) {
+            Set<TaskRuleSimple> taskRuleSimpleCollect = task.getTaskRuleSimples();
+            for (TaskRuleSimple taskRuleSimple : taskRuleSimpleCollect) {
+                Rule rule = ruleDao.findById(taskRuleSimple.getRuleId());
+                if (rule != null && StringUtils.isNotBlank(rule.getExecutionParametersName())) {
+                    ExecutionParameters executionParameters = executionParametersDao.findByNameAndProjectId(rule.getExecutionParametersName(), rule.getProject().getId());
+                    if (executionParameters != null) {
+                        // strategy: 1 = alarm only without blocking, 2 = neither alarm nor block. checkEligible matches the noise parameters: 1. run_date matches 2. template ID matches 3. check conditions match
+                        // When strategy is not null and equals 1 or 2, noise elimination parameters are configured and neither strategy blocks (provided the noise parameters are matched); the union of check conditions decides whether the value falls into the noise range
+                        Integer strategy = checkEligible(rule, executionParameters, taskRuleSimple);
+
+                        boolean startAlarm = (Boolean.FALSE.equals(executionParameters.getWhetherNoise())) || (strategy != null && strategy.equals(NoiseStrategyEnum.ALARM_ONLY_NO_BLOCKING.getCode())) || (strategy == null);
+                        if (startAlarm) {
+                            // No alarm event configured
+                            if (executionParameters.getAlertLevel() != null && StringUtils.isNotBlank(executionParameters.getAlertReceiver())) {
+                                if 
(!alreadyAlertApp.contains(application.getId())) {
+                                    List<TaskRuleSimple> taskRuleSimples = notPassTaskRuleSimples.stream().filter(taskRuleSimpleTemp -> taskRuleSimpleTemp.getAlertLevel() != null).collect(Collectors.toList());
+                                    handleCheckFailure(alreadyAlertApp, application, taskRuleSimples, null, null, null);
+                                }
+                                if (!alreadyAlertApp.contains(application.getId())) {
+                                    List<TaskRuleSimple> taskRuleSimples = failedTaskRuleSimples.stream().filter(taskRuleSimpleTemp -> taskRuleSimpleTemp.getAlertLevel() != null).collect(Collectors.toList());
+                                    handleTaskFailure(alreadyAlertApp, application, taskRuleSimples, null, null, null);
+                                }
+                                continue;
+                            }
+                            // Alarm events configured
+                            if (CollectionUtils.isNotEmpty(executionParameters.getAlarmArgumentsExecutionParameters())) {
+                                for (AlarmArgumentsExecutionParameters parameters : executionParameters.getAlarmArgumentsExecutionParameters()) {
+                                    if (QualitisConstants.CHECK_SUCCESS.toString().equals(parameters.getAlarmEvent().toString())) {
+                                        // a. only pass
+                                        handleCheckSuccess(application, task, taskRuleSimple, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                    } else if (QualitisConstants.CHECK_FAILURE.toString().equals(parameters.getAlarmEvent().toString())) {
+                                        // b. task failed, not pass + abort
+                                        handleTaskFailure(alreadyAlertApp, application, failedTaskRuleSimples, taskRuleSimple, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                        // b. not pass + not abort
+                                        handleCheckFailure(alreadyAlertApp, application, notPassTaskRuleSimples, taskRuleSimple, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                    } else if (QualitisConstants.EXECUTION_COMPLETED.toString().equals(parameters.getAlarmEvent().toString())) {
+                                        // c. pass
+                                        handleCheckSuccess(application, task, taskRuleSimple, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                        // c. not pass + abort
+                                        handleTaskFailureDueToAbort(application, failedTaskRuleSimples, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                        // c. not pass + not abort
+                                        handleCheckFailure(alreadyAlertApp, application, notPassTaskRuleSimples, taskRuleSimple, parameters.getAlarmLevel(), parameters.getAlarmReceiver());
+                                    }
+                                }
+                            }
+
+                        }
+                    }
+                } else {
+                    if ((null != rule && Boolean.TRUE.equals(rule.getAlert())) || (null != taskRuleSimple.getAlertLevel() && StringUtils.isNotEmpty(taskRuleSimple.getAlertReceiver()))) {
+                        if (!alreadyAlertApp.contains(application.getId())) {
+                            List<TaskRuleSimple> taskRuleSimples = notPassTaskRuleSimples.stream().filter(taskRuleSimpleTemp -> taskRuleSimpleTemp.getAlertLevel() != null).collect(Collectors.toList());
+                            handleCheckFailure(alreadyAlertApp, application, taskRuleSimples, null, null, null);
+                        }
+                        if (!alreadyAlertApp.contains(application.getId())) {
+                            List<TaskRuleSimple> taskRuleSimples = failedTaskRuleSimples.stream().filter(taskRuleSimpleTemp -> taskRuleSimpleTemp.getAlertLevel() != null).collect(Collectors.toList());
+                            handleTaskFailure(alreadyAlertApp, application, taskRuleSimples, null, null, null);
+                        }
+                    }
+                }
+            }
+        }
+        handleAbnormalDataRecord(tasks);
+    }
+
+    /**
+     * not pass + abort (verification failed and blocked)
+     *
+     * @param application
+     * @param failedTaskRuleSimples
+     * @param alertRank
+     * @param alertReceiver
+     */
+    private void handleTaskFailureDueToAbort(Application application, List<TaskRuleSimple> failedTaskRuleSimples, Integer alertRank, String alertReceiver) {
+        if (!application.getFailTaskNum().equals(0)) {
+            LOGGER.info("Start to filter not pass + abort task to alarm.");
+            if (CollectionUtils.isNotEmpty(failedTaskRuleSimples)) {
+                int notCheckNum = failedTaskRuleSimples.stream()
+                    .map(taskRuleSimple -> taskRuleSimple.getTaskRuleAlarmConfigList())
+                    .flatMap(taskRuleAlarmConfigList -> taskRuleAlarmConfigList.stream())
+                    .filter(taskRuleAlarmConfig -> AlarmConfigStatusEnum.NOT_CHECK.getCode().equals(taskRuleAlarmConfig.getStatus()))
+                    .collect(Collectors.toList()).size();
+                LOGGER.info("Task has not check num is : " + notCheckNum);
+                if (notCheckNum != 0) {
+                    return;
+                }
+                AlarmUtil.sendFailedMessage(application, 
failedTaskRuleSimples, imsConfig, alarmClient, alarmInfoDao, userDao, alertRank, alertReceiver);
+            }
+        }
+    }
+
+    /**
+     * Collect abnormal data records for alarming: rules that carry metrics and have alarm configuration
+     *
+     * @param tasks
+     */
+    private void handleAbnormalDataRecord(List<Task> tasks) {
+        List<AbnormalDataRecordInfo> abnormalDataRecordInfoList = new ArrayList<>(tasks.size());
+        for (Task task : tasks) {
+            Set<TaskRuleSimple> taskRuleSimpleSet = task.getTaskRuleSimples();
+            for (TaskRuleSimple taskRuleSimple : taskRuleSimpleSet) {
+                if (taskRuleSimple.getAlertLevel() == null) {
+                    continue;
+                }
+                List<TaskRuleAlarmConfig> taskRuleAlarmConfigList = taskRuleSimple.getTaskRuleAlarmConfigList();
+                List<RuleMetric> ruleMetricList = taskRuleAlarmConfigList.stream().map(TaskRuleAlarmConfig::getRuleMetric)
+                    .filter(ruleMetric -> ruleMetric != null).distinct().collect(Collectors.toList());
+                if (CollectionUtils.isEmpty(ruleMetricList)) {
+                    continue;
+                }
+                try {
+                    constructAbnormalDataRecordInfo(task, taskRuleSimple, ruleMetricList, abnormalDataRecordInfoList);
+                } catch (Exception e) {
+                    LOGGER.error("Construct abnormal data failed due to " + e.getMessage());
+                }
+            }
+        }
+        LOGGER.info("Abnormal data record info: {}", Arrays.toString(abnormalDataRecordInfoList.toArray()));
+    }
+
+    /**
+     * Handle verification success (only pass)
+     *
+     * @param application
+     * @param task
+     * @param taskRuleSimple
+     * @param alert
+     * @param alertReceiver
+     */
+    private void handleCheckSuccess(Application application, Task task, TaskRuleSimple taskRuleSimple, Integer alert, String alertReceiver) {
+        if (application.getStatus().equals(ApplicationStatusEnum.FINISHED.getCode()) && task.getStatus().equals(TaskStatusEnum.PASS_CHECKOUT.getCode())) {
+            List<TaskRuleSimple> safes = new ArrayList<>();
+            safes.add(taskRuleSimple);
+            LOGGER.info("Succeed to collect check success simple rule. Simple rules: {}", safes);
+            AlarmUtil.sendAlarmMessage(application, safes, imsConfig, alarmClient, alarmInfoDao, userDao, taskResultStatusDao, alert, alertReceiver, true);
+        }
+    }
+
+
+    /**
+     * not pass + not abort (verification failed, not blocked)
+     *
+     * @param alreadyAlertApp
+     * @param application
+     * @param notSafes
+     * @param currentTaskRuleSimple
+     * @param alertRank
+     * @param alertReceiver
+     */
+    private void handleCheckFailure(List<String> alreadyAlertApp, Application application, List<TaskRuleSimple> notSafes, TaskRuleSimple currentTaskRuleSimple, Integer alertRank, String alertReceiver) {
+        if (!application.getNotPassTaskNum().equals(0)) {
+            if (null != currentTaskRuleSimple) {
+                if (notSafes.contains(currentTaskRuleSimple)) {
+                    List<TaskRuleSimple> taskRuleSimples = new ArrayList<>();
+                    taskRuleSimples.add(currentTaskRuleSimple);
+                    AlarmUtil.sendAlarmMessage(application, taskRuleSimples, imsConfig, alarmClient, alarmInfoDao, userDao, taskResultStatusDao, alertRank, alertReceiver, false);
+                }
+            } else {
+                alreadyAlertApp.add(application.getId());
+                AlarmUtil.sendAlarmMessage(application, notSafes, imsConfig, alarmClient, alarmInfoDao, userDao, taskResultStatusDao, alertRank, alertReceiver, false);
+            }
+
+        }
+    }
+
+    /**
+     * task failed, not pass + abort (task failed, or verification failed and blocked)
+     *
+     * @param alreadyAlertApp
+     * @param application
+     * @param failedTaskRuleSimples
+     * @param currentTaskRuleSimple
+     * @param alertRank
+     * @param alertReceiver
+     */
+    private void handleTaskFailure(List<String> alreadyAlertApp, Application application, List<TaskRuleSimple> failedTaskRuleSimples, TaskRuleSimple currentTaskRuleSimple, Integer alertRank, String alertReceiver) {
+        if (!application.getFailTaskNum().equals(0)) {
+            if (null != currentTaskRuleSimple) {
+                if (failedTaskRuleSimples.contains(currentTaskRuleSimple)) {
+                    List<TaskRuleSimple> taskRuleSimples = new ArrayList<>();
+                    taskRuleSimples.add(currentTaskRuleSimple);
+                    AlarmUtil.sendFailedMessage(application, taskRuleSimples, imsConfig, alarmClient, alarmInfoDao, userDao, alertRank, alertReceiver);
+                }
+            } else {
+                alreadyAlertApp.add(application.getId());
+                
AlarmUtil.sendFailedMessage(application, failedTaskRuleSimples, imsConfig, alarmClient, alarmInfoDao, userDao, alertRank, alertReceiver); + } + } + } + + private void constructAbnormalDataRecordInfo(Task task, TaskRuleSimple taskRuleSimple, List ruleMetricList, List abnormalDataRecordInfoList) { + RuleMetric currentRuleMetric = ruleMetricList.iterator().next(); + String departmentName = currentRuleMetric.getDevDepartmentName(); + Integer subSystemId = currentRuleMetric.getSubSystemId(); + if (null == subSystemId) { + subSystemId = QualitisConstants.SUB_SYSTEM_ID; + } + + int execNum = 1; + boolean collectTask = task.getStatus().equals(TaskStatusEnum.FAILED.getCode()) || task.getStatus().equals(TaskStatusEnum.FAIL_CHECKOUT.getCode()) || task.getStatus().equals(TaskStatusEnum.CANCELLED.getCode()); + int alarmNum = collectTask && taskRuleSimple.getAlertLevel() < Integer.parseInt(ImsLevelEnum.INFO.getCode()) ? 1 : 0; + + for (TaskDataSource taskDataSource : task.getTaskDataSources()) { + String dbName = taskDataSource.getDatabaseName(); + String tableName = taskDataSource.getTableName(); + long currentTime = System.currentTimeMillis(); + + String nowDate = QualitisConstants.PRINT_DATE_FORMAT.format(new Date(currentTime)); + String datasourceType = TemplateDataSourceTypeEnum.getMessage(taskDataSource.getDatasourceType()); + AbnormalDataRecordInfo abnormalDataRecordInfoExists = abnormalDataRecordInfoDao.findByPrimary(taskRuleSimple.getRuleId(), dbName, tableName, nowDate); + String standardRuleName = "DQM-" + taskRuleSimple.getProjectName() + "-" + taskRuleSimple.getRuleName() + "-" + taskRuleSimple.getTemplateName(); + String standardRuleDetail = taskRuleSimple.getProjectName() + "-" + taskRuleSimple.getRuleName() + "-" + taskRuleSimple.getTemplateName(); + if (abnormalDataRecordInfoExists == null) { + AbnormalDataRecordInfo abnormalDataRecordInfo = new AbnormalDataRecordInfo(taskRuleSimple.getRuleId(), standardRuleName, datasourceType + , dbName, tableName, 
departmentName, subSystemId, execNum, alarmNum); + abnormalDataRecordInfo.setRuleDetail(StringUtils.isEmpty(taskRuleSimple.getRuleDetail()) ? standardRuleDetail : taskRuleSimple.getRuleDetail()); + abnormalDataRecordInfo.setRecordDate(nowDate); + abnormalDataRecordInfo.setRecordTime(QualitisConstants.PRINT_TIME_FORMAT.format(currentTime)); + abnormalDataRecordInfoList.add(abnormalDataRecordInfoDao.save(abnormalDataRecordInfo)); + } else { + abnormalDataRecordInfoExists.setRuleName(standardRuleName); + abnormalDataRecordInfoExists.setRuleDetail(StringUtils.isEmpty(taskRuleSimple.getRuleDetail()) ? standardRuleDetail : taskRuleSimple.getRuleDetail()); + abnormalDataRecordInfoExists.setRecordTime(QualitisConstants.PRINT_TIME_FORMAT.format(currentTime)); + abnormalDataRecordInfoExists.setExecuteNum(abnormalDataRecordInfoExists.getExecuteNum() + execNum); + abnormalDataRecordInfoExists.setEventNum(abnormalDataRecordInfoExists.getEventNum() + alarmNum); + abnormalDataRecordInfoExists.setDepartmentName(departmentName); + abnormalDataRecordInfoExists.setDatasource(datasourceType); + abnormalDataRecordInfoExists.setSubSystemId(subSystemId); + abnormalDataRecordInfoList.add(abnormalDataRecordInfoDao.save(abnormalDataRecordInfoExists)); + } + } + } + + @Override + @Transactional(propagation = Propagation.REQUIRED, rollbackFor = {RuntimeException.class, UnExpectedRequestException.class}) + public void abnormalDataRecordAlarm() { + // Check if upload with another qualitis. + try { + InetAddress inetAddress = InetAddress.getLocalHost(); + LOGGER.info("Start to find upload success record today. 
Ip:" + inetAddress.getHostAddress()); + } catch (UnknownHostException e) { + LOGGER.error("Failed to get host info and record log."); + } + + Date nowDate = new Date(); + UploadRecord uploadRecord = uploadRecordDao.findByUnique(new Date(nowDate.getTime()), true); + if (uploadRecord != null) { + LOGGER.info("Upload record today successfully."); + return; + } else { + LOGGER.info("Upload record today is still incomplete."); + } + + LOGGER.info("Start to find yesterday's abnormal data record with exists rules and alarm."); + Calendar calendar = Calendar.getInstance(); + calendar.setTime(nowDate); + calendar.add(Calendar.DAY_OF_MONTH, -1); + + List abnormalDataRecordInfoList = abnormalDataRecordInfoDao.findWithExistRulesByRecordDate(QualitisConstants.PRINT_DATE_FORMAT.format(new Date(calendar.getTime().getTime()))); + if (CollectionUtils.isEmpty(abnormalDataRecordInfoList)) { + LOGGER.info("No abnormal data record to alarm."); + return; + } + UploadRecord currentUploadRecord = new UploadRecord(abnormalDataRecordInfoList.size(), true, new Date(nowDate.getTime()) + , QualitisConstants.PRINT_TIME_FORMAT.format(nowDate), ""); + int stage = abnormalDataRecordInfoList.size() / BATCH_ABNORMAL_DATA_RECORD + 1; + try { + for (int index = 0; index < stage; index++) { + // Report in batch. 
+ List currentAbnormalDataRecordInfoList; + if (index * BATCH_ABNORMAL_DATA_RECORD + BATCH_ABNORMAL_DATA_RECORD < abnormalDataRecordInfoList.size()) { + currentAbnormalDataRecordInfoList = abnormalDataRecordInfoList.subList(index * BATCH_ABNORMAL_DATA_RECORD, index * BATCH_ABNORMAL_DATA_RECORD + BATCH_ABNORMAL_DATA_RECORD); + + } else { + currentAbnormalDataRecordInfoList = abnormalDataRecordInfoList.subList(index * BATCH_ABNORMAL_DATA_RECORD, abnormalDataRecordInfoList.size()); + } + + List> data = new ArrayList<>(currentAbnormalDataRecordInfoList.size()); + for (AbnormalDataRecordInfo abnormalDataRecordInfo : currentAbnormalDataRecordInfoList) { + Map map = new HashMap<>(16); + map.put("ruleId", abnormalDataRecordInfo.getRuleId()); + map.put("ruleName", abnormalDataRecordInfo.getRuleName()); + map.put("ruleDetail", abnormalDataRecordInfo.getRuleDetail()); + map.put("dataSource", abnormalDataRecordInfo.getDatasource().toUpperCase()); + map.put("dbName", abnormalDataRecordInfo.getDbName()); + map.put("tableName", abnormalDataRecordInfo.getTableName()); + map.put("dept", abnormalDataRecordInfo.getDepartmentName()); + map.put("subsystemId", abnormalDataRecordInfo.getSubSystemId()); + map.put("executeNum", abnormalDataRecordInfo.getExecuteNum()); + map.put("eventNum", abnormalDataRecordInfo.getEventNum()); + map.put("reportSource", "Qualitis"); + data.add(map); + } + // Alarm client. + AlarmUtil.sendAbnormalDataRecordAlarm(imsConfig, alarmClient, data); + } + } catch (Exception e) { + LOGGER.error("Failed to send abnormal data record ims. 
Exception: {}", e.getMessage(), e);
+            currentUploadRecord.setStatus(false);
+            currentUploadRecord.setErrMsg(e.getMessage());
+        }
+
+        uploadRecordDao.save(currentUploadRecord);
+    }
+
+    private void checkIfReport(Application application, ImsConfig imsConfig) {
+        List<Task> tasks = taskDao.findByApplication(application);
+        List<ReportBatchInfo> reportBatchInfos;
+        try {
+            LOGGER.info("Start to collect task result and to construct report metric data.");
+            reportBatchInfos = ReportUtil.collectTaskResult(tasks, taskResultDao, taskDataSourceDao, ruleMetricDao, imsConfig);
+            LOGGER.info("Success to collect task result and to construct report metric data.");
+
+            // Construct report content and report.
+            for (ReportBatchInfo reportBatchInfo : reportBatchInfos) {
+                ReportUtil.reportTaskResult(reportBatchInfo, alarmClient);
+            }
+        } catch (Exception e) {
+            LOGGER.error("Report to IMS failed in batch call!");
+            LOGGER.error(e.getMessage(), e);
+        }
+
+    }
+
+    /**
+     * Check whether the rule configuration matches the noise elimination parameters.
+     * Strategy 1: alarm only, no blocking; strategy 2: no alarm, no blocking.
+     * Blocking turns a failed verification into a failed task.
+     *
+     * Noise parameter matching:
+     * 1. run_date matches; the page input uses a date format such as 20221219
+     * 2. template ID matches;
+     * 3. check conditions match;
+     *
+     * @param rule the rule
+     * @param executionParameters the execution parameter template
+     * @param taskRuleSimple
+     * @return
+     */
+    private Integer checkEligible(Rule rule, ExecutionParameters executionParameters, TaskRuleSimple taskRuleSimple) {
+        // Fetch run_date
+        List<TaskResult> taskResultList = taskResultDao.findByApplicationAndRule(taskRuleSimple.getApplicationId(), rule.getId());
+        // Timestamp format, e.g. 1661616000000
+        Set<Long> collect = taskResultList.stream().map(item -> item.getRunDate() != null ? item.getRunDate() : null).collect(Collectors.toSet());
+
+        LOGGER.info("Execution Variable run_date : {}", CollectionUtils.isNotEmpty(collect) ? collect.iterator().next() : null);
+
+        if (executionParameters.getWhetherNoise() != null && Boolean.TRUE.equals(executionParameters.getWhetherNoise()) && CollectionUtils.isNotEmpty(executionParameters.getNoiseEliminationManagement())) {
+
+            for (NoiseEliminationManagement noiseEliminationManagement : executionParameters.getNoiseEliminationManagement()) {
+
+                if (Boolean.FALSE.equals(noiseEliminationManagement.getAvailable())) {
+                    continue;
+                }
+
+                Set<Long> dateCollectResponses = Sets.newHashSet();
+                if (StringUtils.isNotBlank(noiseEliminationManagement.getBusinessDate())) {
+                    String[] businessDate = noiseEliminationManagement.getBusinessDate().split(",");
+                    for (String date : businessDate) {
+                        dateCollectResponses.add(Long.parseLong(date));
+                    }
+                }
+                // Business time range, e.g. 1661616000000
+                boolean flag = dateCollectResponses.stream().anyMatch(new HashSet<>(collect)::contains);
+
+                LOGGER.info("Noise removal parameters template ID : {}", noiseEliminationManagement.getTemplateId() != null ? 
noiseEliminationManagement.getTemplateId() : null);
+                // Template ID
+                boolean templateIdOrNo = noiseEliminationManagement.getTemplateId().toString().equals(rule.getTemplate().getId().toString());
+
+                List<CheckConditionsResponse> collectList = CustomObjectMapper.transJsonToObjects(noiseEliminationManagement.getNoiseNormRatio(), CheckConditionsResponse.class);
+                // Because CompareType may be null, default it to 0, a value that does not exist in CheckTemplateEnum
+                if (CollectionUtils.isNotEmpty(collectList)) {
+                    for (CheckConditionsResponse item : collectList) {
+                        item.setOutputMetaName(null);
+                        item.setCompareType(item.getCompareType() != null ? item.getCompareType() : 0);
+                    }
+
+                }
+                List<CheckConditionsResponse> noiseLists = rule.getAlarmConfigs().stream().map(temp -> {
+                    CheckConditionsResponse gather = new CheckConditionsResponse();
+                    gather.setOutputMetaId(temp.getTemplateOutputMeta().getId());
+                    gather.setCheckTemplate(temp.getCheckTemplate());
+                    gather.setCompareType(temp.getCompareType() != null ? temp.getCompareType() : 0);
+                    gather.setThreshold(temp.getThreshold());
+                    return gather;
+                }).collect(Collectors.toList());
+
+                // Check conditions
+                boolean condition = false;
+                for (CheckConditionsResponse check : collectList) {
+
+                    CheckConditionsResponse response = noiseLists.stream().filter(temp ->
+                        check.getOutputMetaId().toString().equals(temp.getOutputMetaId().toString()) &&
+                            check.getCheckTemplate().toString().equals(temp.getCheckTemplate().toString()) &&
+                            check.getCompareType().toString().equals(temp.getCompareType().toString()) &&
+                            check.getThreshold().toString().equals(temp.getThreshold().toString())
+                    ).findAny().orElse(null);
+                    if (response != null) {
+                        condition = true;
+                    }
+
+                }
+
+                LOGGER.info("Judge whether the noise removal parameters are met, verification conditions : {}, business time: {}, template id: {}, Enable or not: {}",
+                    condition, flag, templateIdOrNo, noiseEliminationManagement.getAvailable());
+                if (Boolean.TRUE.equals(condition) && Boolean.TRUE.equals(flag) && Boolean.TRUE.equals(templateIdOrNo) && Boolean.TRUE.equals(noiseEliminationManagement.getAvailable())) {
+                    // Noise parameters matched: mark the results as noise; a value in the noise range is excluded from normal trend-change verification
+                    if (CollectionUtils.isNotEmpty(taskResultList)) {
+                        // Set the denoising flag to true on each result
+                        taskResultList.stream().forEach(e -> e.setDenoisingValue(true));
+                        for (TaskResult taskResult : taskResultList) {
+                            taskResultDao.saveTaskResult(taskResult);
+                        }
+                    }
+                    LOGGER.info("Noise reduction strategy : {}", noiseEliminationManagement.getEliminateStrategy() != null ? NoiseStrategyEnum.getNoiseStrategyMessage(noiseEliminationManagement.getEliminateStrategy()) : null);
+                    return noiseEliminationManagement.getEliminateStrategy();
+                }
+
+            }
+
+        }
+        return null;
+    }
+
+    private Boolean checkWhetherBlocked(Task task) {
+        Set<TaskRuleSimple> taskRuleSimpleCollect = task.getTaskRuleSimples();
+        for (TaskRuleSimple taskRuleSimple : taskRuleSimpleCollect) {
+            Rule rule = ruleDao.findById(taskRuleSimple.getRuleId());
+            if (StringUtils.isNotBlank(rule.getExecutionParametersName())) {
+                ExecutionParameters executionParameters = executionParametersDao
+                    .findByNameAndProjectId(rule.getExecutionParametersName(), rule.getProject().getId());
+                if (executionParameters != null) {
+                    Integer strategy = checkEligible(rule, executionParameters, taskRuleSimple);
+                    if (strategy != null) {
+                        return true;
+                    }
+
+                }
+            }
+        }
+        return false;
+    }
+}
diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterRunnable.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterRunnable.java
index 13c4ab98..8b0b2380 100644
--- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterRunnable.java
+++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterRunnable.java
@@ -17,13 +17,10 @@
 package com.webank.wedatasphere.qualitis.timer;
 
 import com.webank.wedatasphere.qualitis.bean.JobChecker;
-import com.webank.wedatasphere.qualitis.dao.ApplicationDao;
-import com.webank.wedatasphere.qualitis.dao.TaskDao;
-import 
com.webank.wedatasphere.qualitis.ha.AbstractServiceCoordinator; import java.util.List; -import java.util.concurrent.CountDownLatch; import org.slf4j.Logger; import org.slf4j.LoggerFactory; +import java.util.concurrent.CountDownLatch; /** * @author allenzhou diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterThreadFactory.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterThreadFactory.java index 2764749d..07c98d63 100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterThreadFactory.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UpdaterThreadFactory.java @@ -16,7 +16,6 @@ package com.webank.wedatasphere.qualitis.timer; -import java.util.Random; import java.util.concurrent.ThreadFactory; /** @@ -25,6 +24,6 @@ public class UpdaterThreadFactory implements ThreadFactory { @Override public Thread newThread(Runnable r) { - return new Thread(r); + return new Thread(r, "Application Update Thread"); } } diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UploaderRunnable.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UploaderRunnable.java new file mode 100644 index 00000000..33f8dd3f --- /dev/null +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/timer/UploaderRunnable.java @@ -0,0 +1,51 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package com.webank.wedatasphere.qualitis.timer; + +import com.webank.wedatasphere.qualitis.ha.AbstractServiceCoordinator; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +/** + * @author allenzhou + */ +public class UploaderRunnable implements Runnable { + private IChecker iChecker; + private AbstractServiceCoordinator abstractServiceCoordinator; + + private static final Logger LOGGER = LoggerFactory.getLogger(UploaderRunnable.class); + + + public UploaderRunnable(IChecker iChecker, AbstractServiceCoordinator abstractServiceCoordinator) { + this.abstractServiceCoordinator = abstractServiceCoordinator; + this.iChecker = iChecker; + } + + @Override + public void run() { + try { + LOGGER.info(Thread.currentThread().getName() + " start to upload rules."); + abstractServiceCoordinator.coordinate(); + iChecker.abnormalDataRecordAlarm(); + } catch (Exception e) { + LOGGER.error("Failed to upload rules, caused by: {}", e.getMessage(), e); + } finally { + abstractServiceCoordinator.release(); + } + } + +} diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/AlarmUtil.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/AlarmUtil.java new file mode 100644 index 00000000..d3cf0d19 --- /dev/null +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/AlarmUtil.java @@ -0,0 +1,699 @@ +package com.webank.wedatasphere.qualitis.util; + +import com.webank.wedatasphere.qualitis.checkalert.entity.CheckAlert; +import com.webank.wedatasphere.qualitis.client.AlarmClient; +import com.webank.wedatasphere.qualitis.config.ImsConfig; +import com.webank.wedatasphere.qualitis.constant.AlarmConfigStatusEnum; +import com.webank.wedatasphere.qualitis.constant.AlertTypeEnum; +import com.webank.wedatasphere.qualitis.constant.ApplicationStatusEnum; +import com.webank.wedatasphere.qualitis.constant.ImsLevelEnum; +import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; +import 
com.webank.wedatasphere.qualitis.constants.QualitisConstants; +import com.webank.wedatasphere.qualitis.dao.AlarmInfoDao; +import com.webank.wedatasphere.qualitis.dao.ApplicationCommentDao; +import com.webank.wedatasphere.qualitis.dao.RoleDao; +import com.webank.wedatasphere.qualitis.dao.TaskResultStatusDao; +import com.webank.wedatasphere.qualitis.dao.UserDao; +import com.webank.wedatasphere.qualitis.dao.UserRoleDao; +import com.webank.wedatasphere.qualitis.entity.AlarmInfo; +import com.webank.wedatasphere.qualitis.entity.Application; +import com.webank.wedatasphere.qualitis.entity.ApplicationComment; +import com.webank.wedatasphere.qualitis.entity.Role; +import com.webank.wedatasphere.qualitis.entity.RuleMetric; +import com.webank.wedatasphere.qualitis.entity.Task; +import com.webank.wedatasphere.qualitis.entity.TaskDataSource; +import com.webank.wedatasphere.qualitis.entity.TaskResult; +import com.webank.wedatasphere.qualitis.entity.TaskResultStatus; +import com.webank.wedatasphere.qualitis.entity.TaskRuleAlarmConfig; +import com.webank.wedatasphere.qualitis.entity.TaskRuleSimple; +import com.webank.wedatasphere.qualitis.entity.User; +import com.webank.wedatasphere.qualitis.entity.UserRole; +import com.webank.wedatasphere.qualitis.rule.constant.CheckTemplateEnum; +import com.webank.wedatasphere.qualitis.rule.constant.CompareTypeEnum; +import com.webank.wedatasphere.qualitis.rule.dao.ExecutionParametersDao; +import com.webank.wedatasphere.qualitis.rule.entity.AlarmArgumentsExecutionParameters; +import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters; +import com.webank.wedatasphere.qualitis.rule.entity.Rule; +import org.apache.commons.collections.CollectionUtils; +import org.apache.commons.lang3.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Collections; +import java.util.HashMap; +import java.util.HashSet; +import java.util.List; +import 
java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+/**
+ * @author allenzhou
+ */
+public class AlarmUtil {
+
+    private static final Logger LOGGER = LoggerFactory.getLogger("monitor");
+
+    private static final String ADMIN = "ADMIN";
+
+    private AlarmUtil() {
+
+    }
+
+    public static List<TaskRuleAlarmConfig> notSafeAlarmConfig(List<Task> tasks) {
+        List<TaskRuleAlarmConfig> taskRuleAlarmConfigs = new ArrayList<>();
+        for (Task task : tasks) {
+            for (TaskRuleSimple taskRuleSimple : task.getTaskRuleSimples()) {
+                if (taskRuleSimple.getAlertLevel() == null) {
+                    continue;
+                }
+                for (TaskRuleAlarmConfig taskRuleAlarmConfig : taskRuleSimple.getTaskRuleAlarmConfigList()) {
+                    if (taskRuleAlarmConfig.getStatus().equals(AlarmConfigStatusEnum.NOT_PASS.getCode())) {
+                        taskRuleAlarmConfigs.add(taskRuleAlarmConfig);
+                    }
+                }
+            }
+        }
+        return taskRuleAlarmConfigs;
+    }
+
+    public static List<TaskRuleSimple> notSafeTaskRuleSimple(List<Task> tasks) {
+        List<TaskRuleSimple> taskRuleSimples = new ArrayList<>();
+        for (Task task : tasks) {
+            for (TaskRuleSimple taskRuleSimple : task.getTaskRuleSimples()) {
+                List<TaskRuleAlarmConfig> taskRuleAlarmConfigList = taskRuleSimple.getTaskRuleAlarmConfigList();
+
+                for (TaskRuleAlarmConfig taskRuleAlarmConfig : taskRuleAlarmConfigList) {
+                    if (taskRuleAlarmConfig.getStatus().equals(AlarmConfigStatusEnum.NOT_PASS.getCode())) {
+                        taskRuleSimples.add(taskRuleSimple);
+                        break;
+                    }
+                }
+            }
+        }
+        return taskRuleSimples;
+    }
+
+    public static void sendAlarmMessage(Application application, List<TaskRuleSimple> checkFailedRules, ImsConfig imsConfig, AlarmClient client
+        , AlarmInfoDao alarmInfoDao, UserDao userDao, TaskResultStatusDao taskResultStatusDao, Integer alert, String alertReceiver, Boolean flag) {
+        boolean bdap = imsConfig.getTitlePrefix().contains("BDAP");
+        // Build the alarm content
+        StringBuilder alertInfo = new StringBuilder();
+        List<Map<String, Object>> requestList = new ArrayList<>(checkFailedRules.size());
+        // Iterate over every rule
+        for (TaskRuleSimple taskRuleSimpleTemp : checkFailedRules) {
+            Map<String, Object> request = new HashMap<>(6);
+            // Build the alarm title
+            String cnName = 
taskRuleSimpleTemp.getCnName(); + String enName = taskRuleSimpleTemp.getRuleName(); + String realRuleName = StringUtils.isNotEmpty(cnName) ? cnName : enName; + + String enProjectName = taskRuleSimpleTemp.getProjectName(); + String cnProjectName = taskRuleSimpleTemp.getProjectCnName(); + String realProjectName = StringUtils.isNotEmpty(cnProjectName) ? cnProjectName : enProjectName; + + String alertTitle = ""; + alertInfo.append(getDepartAlerters(taskRuleSimpleTemp, userDao, AlertTypeEnum.TASK_FAIL_CHECKOUT.getCode(), StringUtils.isNotBlank(alertReceiver) ? alertReceiver : taskRuleSimpleTemp.getAlertReceiver())).append("\n"); + if (flag) { + alertTitle = imsConfig.getNewTitleSucceedPrefix() + "【" + realRuleName + "】"; + alertInfo.append("Qualitis项目: ").append(realProjectName).append(", 技术规则: ").append(realRuleName).append(" 任务校验通过.\n"); + } else { + alertTitle = imsConfig.getNewTitlePrefix() + "【" + realRuleName + "】"; + alertInfo.append("Qualitis项目: ").append(realProjectName).append(", 技术规则: ").append(realRuleName).append(" 任务校验不通过.\n"); + } + alertInfo.append("库表信息:").append(retrieveDatasource(taskRuleSimpleTemp)).append("\n"); + + List taskRuleAlarmConfigList = taskRuleSimpleTemp.getTaskRuleAlarmConfigList(); + Map taskRuleAlarmConfigMap = taskRuleAlarmConfigList.stream().collect(Collectors.toMap(TaskRuleAlarmConfig::getId, t -> t, (oValue, nValue) -> nValue)); + List taskResultStatusList = taskResultStatusDao.findByStatus(application.getId(), taskRuleSimpleTemp.getRuleId(), AlarmConfigStatusEnum.NOT_PASS.getCode()); + for (TaskResultStatus taskResultStatus : taskResultStatusList) { + TaskRuleAlarmConfig alarmConfig = taskRuleAlarmConfigMap.get(taskResultStatus.getTaskRuleAlarmConfigId()); + if (null == alarmConfig) { + continue; + } + TaskResult taskResult = taskResultStatus.getTaskResult(); + String value = StringUtils.isBlank(taskResult.getValue()) ? 
"empty value" : taskResult.getValue(); + String compareValue = StringUtils.isBlank(taskResult.getCompareValue()) ? "empty value" : taskResult.getCompareValue(); + if (alarmConfig.getRuleMetric() == null || alarmConfig.getRuleMetric().getId().equals(taskResult.getRuleMetricId())) { + alarmStringAppend(alertInfo, alarmConfig, value, compareValue, realRuleName, realProjectName, taskResult.getEnvName()); + } + } + + alertInfo.append("\n也可进入 Qualitis 系统查看详情。"); + List ruleMetrics = taskRuleSimpleTemp.getTaskRuleAlarmConfigList().stream().map(TaskRuleAlarmConfig::getRuleMetric) + .filter(ruleMetric -> ruleMetric != null).collect(Collectors.toList()); + // 获取告警规则关联子系统 + int subSystemId = QualitisConstants.SUB_SYSTEM_ID; + if (CollectionUtils.isEmpty(ruleMetrics)) { + LOGGER.info("Qualitis find project's subsystem ID or datasource's subsystem ID because there is no rule metric. Rule name: " + realRuleName); + if (null != application.getSubSystemId()) { + subSystemId = application.getSubSystemId().intValue(); + } + if (taskRuleSimpleTemp.getTask() != null && CollectionUtils.isNotEmpty(taskRuleSimpleTemp.getTask().getTaskDataSources())) { + List subSystemIds = taskRuleSimpleTemp.getTask().getTaskDataSources().stream() + .map(taskDataSource -> taskDataSource.getSubSystemId()).filter(ele -> ele != null).collect( + Collectors.toList()); + if (CollectionUtils.isNotEmpty(subSystemIds)) { + Long currentSubSystemId = subSystemIds.iterator().next(); + if (currentSubSystemId != null) { + subSystemId = currentSubSystemId.intValue(); + } + } + } + } else { + // 获取子系统 + if (ruleMetrics.iterator().next().getSubSystemId() != null) { + subSystemId = ruleMetrics.iterator().next().getSubSystemId(); + } + } + // 获取告警级别 + int alertLevel = alert != null ? alert.intValue() : taskRuleSimpleTemp.getAlertLevel().intValue(); + // 获取告警人 + List receivers = getReceivers(taskRuleSimpleTemp, AlertTypeEnum.TASK_FAIL_CHECKOUT.getCode(), StringUtils.isNotBlank(alertReceiver) ? 
alertReceiver : taskRuleSimpleTemp.getAlertReceiver()); + // Alert object: database1[table1,table2];database2[table3,table4] + String alertObj = contructAlertObj(taskRuleSimpleTemp.getTask().getTaskDataSources()); + // 封装告警 + packageAlarm(alertInfo, request, alertTitle, subSystemId, alertLevel, receivers, alertObj, imsConfig); + requestList.add(request); + + // 保存alarm_info表 + for (String username : receivers) { + AlarmInfo alarmInfo = new AlarmInfo(getAlertLevel(alertLevel), alertInfo.toString(), application.getId(), application.getSubmitTime(), + application.getFinishTime(), application.getFinishTime(), username, AlertTypeEnum.TASK_FAIL_CHECKOUT.getCode(), realProjectName); + alarmInfoDao.save(alarmInfo); + } + + if (bdap) { + client.sendAlarm(StringUtils.join(receivers, ","), imsConfig.getTitlePrefix() + "集群 Qualitis 任务告警\n" + , alertInfo.toString(), String.valueOf(alertLevel)); + } + + alertInfo.delete(0, alertInfo.length()); + } + if (CollectionUtils.isNotEmpty(requestList) && !bdap) { + client.sendNewAlarm(requestList); + } + } + + private static String retrieveDatasource(TaskRuleSimple taskRuleSimpleTemp) { + if (CollectionUtils.isNotEmpty(taskRuleSimpleTemp.getTask().getTaskDataSources())) { + String dbAndTable = taskRuleSimpleTemp.getTask().getTaskDataSources().stream().filter(taskDataSource -> taskDataSource.getRuleId().equals(taskRuleSimpleTemp.getRuleId())).map(taskDataSource -> + (StringUtils.isNotEmpty(taskDataSource.getDatabaseName()) ? taskDataSource.getDatabaseName() : "") + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + + (StringUtils.isNotEmpty(taskDataSource.getTableName()) ? taskDataSource.getTableName() : "") + SpecCharEnum.PERIOD_NO_ESCAPE.getValue() + + (StringUtils.isNotEmpty(taskDataSource.getColName()) ? 
taskDataSource.getColName() : "[]")) + .collect(Collectors.joining(SpecCharEnum.DIVIDER.getValue())); + return dbAndTable; + } + return ""; + } + + private static void packageAlarm(StringBuilder alertInfo, Map request, String alertTitle, int subSystemId, int alertLevel, + List receivers, String alertObj, ImsConfig imsConfig) { + request.put("alert_reciver", StringUtils.join(receivers, ",")); + request.put("alert_info", alertInfo.toString()); + request.put("sub_system_id", subSystemId); + request.put("alert_title", alertTitle); + request.put("alert_level", alertLevel); + request.put("alert_obj", alertObj); + // RTX, EMAIL + request.put("alert_way", imsConfig.getAlertWay()); + request.put("ci_type_id", 55); + } + + /** + * 告警内容添加科室通知人 + * + * @param taskRuleSimple taskRuleSimple + * @param userDao userDao + * @param alarmCode + * @return 请通知XX + */ + public static String getDepartAlerters(TaskRuleSimple taskRuleSimple, UserDao userDao, Integer alarmCode, String alertReceiver) { + User creator = userDao.findByUsername(taskRuleSimple.getProjectCreator()); + StringBuilder alerters = new StringBuilder("请通知:"); + String creatorName = creator.getUsername(); + alerters.append("告警接收人").append(alertReceiver); + Set departAlerters = new HashSet<>(); + + if (alarmCode.equals(AlertTypeEnum.TASK_FAILED.getCode())) { + Role role = SpringContextHolder.getBean(RoleDao.class).findByRoleName(ADMIN); + Set admins = SpringContextHolder.getBean(UserRoleDao.class).findByRole(role).stream().map(UserRole::getUser).map(User::getUsername).collect(Collectors.toSet()); + departAlerters.addAll(admins); + departAlerters.remove(creatorName); + + if (departAlerters.isEmpty()) { + return alerters.toString(); + } + // 增加失败人 + alerters.append(",大数据平台室").append(StringUtils.join(departAlerters, ",")); + return alerters.toString(); + } else if (alarmCode.equals(AlertTypeEnum.TASK_FAIL_CHECKOUT.getCode())) { + return alerters.toString(); + } + + return alerters.toString(); + } + + /** + * 告警内容添加初始化失败接收人信息 
+ * + * @param rules + * @return 请通知XX + */ + public static String getDepartAlerters(List rules) { + StringBuilder alerters = new StringBuilder("请通知:"); + alerters.append("告警接收人"); + for (Rule rule : rules) { + if (StringUtils.isNotBlank(rule.getExecutionParametersName())) { + ExecutionParameters executionParameters = SpringContextHolder.getBean(ExecutionParametersDao.class).findByNameAndProjectId(rule.getExecutionParametersName(), rule.getProject().getId()); + if (executionParameters != null) { + //兼容旧规则数据 + if (StringUtils.isNotBlank(executionParameters.getAlertReceiver())) { + if (!alerters.toString().contains(executionParameters.getAlertReceiver())) { + alerters.append( + StringUtils.isNotEmpty(executionParameters.getAlertReceiver()) ? executionParameters.getAlertReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + } + continue; + } + //0.23.0版本 + if (CollectionUtils.isNotEmpty(executionParameters.getAlarmArgumentsExecutionParameters())) { + for (AlarmArgumentsExecutionParameters parameters : executionParameters.getAlarmArgumentsExecutionParameters()) { + if (!alerters.toString().contains(parameters.getAlarmReceiver())) { + alerters.append(StringUtils.isNotEmpty(parameters.getAlarmReceiver()) ? parameters.getAlarmReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + } + } + } + + } + } else { + alerters.append(StringUtils.isNotEmpty(rule.getAlertReceiver()) ? 
rule.getAlertReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + } + } + alerters.append("\n"); + return alerters.toString(); + } + + /** + * Qualitis 项目xxx 技术规则xxx 任务运行完成,不符合数据质量要求。原因:任务运行结果: [5], 超出设定阈值: [4], 比较模版: [月波动], 比较方式: [大于] + * + * @param alertInfo + * @param alarmConfig + */ + private static void alarmStringAppend(StringBuilder alertInfo, TaskRuleAlarmConfig alarmConfig, String value, String compareValue, String ruleName, String projectName, String envName) { + Integer checkTemplate = alarmConfig.getCheckTemplate(); + String checkTemplateName = CheckTemplateEnum.getCheckTemplateName(checkTemplate); + alertInfo.append("Qualitis项目: ").append(projectName). + append(" 技术规则: ").append(ruleName) + .append(" 任务运行完成, 不符合数据质量要求。原因: ") + .append(alarmConfig.getOutputName() + " - [").append(StringUtils.isEmpty(value) ? "" : value) + .append(alarmConfig.getOutputUnit() == null ? "" : alarmConfig.getOutputUnit()).append("]") + .append(", 不符合设定阈值: [").append(alarmConfig.getThreshold()).append(alarmConfig.getOutputUnit() == null ? 
"" : alarmConfig.getOutputUnit()) + .append("]") + .append(", 比较模版: [").append(checkTemplateName).append("]"); + if (checkTemplate.equals(CheckTemplateEnum.FIXED_VALUE.getCode())) { + Integer compareType = alarmConfig.getCompareType(); + String compareTypeName = CompareTypeEnum.getCompareTypeName(compareType); + alertInfo.append(", 比较方式: [").append(compareTypeName).append("]"); + } else if (checkTemplate.equals(CheckTemplateEnum.FULL_YEAR_RING_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.HALF_YEAR_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.SEASON_RING_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.MONTH_RING_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.WEEK_RING_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.HOUR_RING_GROWTH.getCode()) + || checkTemplate.equals(CheckTemplateEnum.DAY_RING_GROWTH.getCode())) { + alertInfo.append(", 计算过程:(本期 - 上期) / 上期:") + .append(compareValue); + } else if (checkTemplate.equals(CheckTemplateEnum.DAY_FLUCTUATION.getCode()) + || checkTemplate.equals(CheckTemplateEnum.MONTH_FLUCTUATION.getCode()) + || checkTemplate.equals(CheckTemplateEnum.WEEK_FLUCTUATION.getCode())) { + alertInfo.append(", 过去的平均值:").append(compareValue); + } + if (StringUtils.isNotBlank(envName)) { + alertInfo.append("。环境名称: [").append(envName).append("]"); + } + alertInfo.append("\n"); + } + + public static List getReceivers(TaskRuleSimple taskRuleSimple, Integer alarmCode, String alertReceiver) { + List users = new ArrayList<>(); + // 增加规则关注人 + if (StringUtils.isNotBlank(alertReceiver)) { + Collections.addAll(users, alertReceiver.split(",")); + } + if (!users.contains(taskRuleSimple.getProjectCreator())) { + // 增加创建者 + users.add(taskRuleSimple.getProjectCreator()); + } + if (alarmCode.equals(AlertTypeEnum.TASK_FAILED.getCode()) || alarmCode.equals(AlertTypeEnum.TASK_FAIL_CHECKOUT.getCode())) { + // Automatically grant the highest authority to the system administrator. 
+ // 增加超出阈值告警人 + Role role = SpringContextHolder.getBean(RoleDao.class).findByRoleName(ADMIN); + Set admins = SpringContextHolder.getBean(UserRoleDao.class).findByRole(role).stream().map(UserRole::getUser).map(User::getUsername).collect(Collectors.toSet()); + for (String user : admins) { + if (!users.contains(user)) { + users.add(user); + } + } + } + + return users; + } + + private static String getAlertLevel(Integer alert) { + String alertLevel; + if (alert.toString().equals(ImsLevelEnum.CRITICAL.getCode())) { + alertLevel = ImsLevelEnum.CRITICAL.getCode(); + } else if (alert.toString().equals(ImsLevelEnum.MAJOR.getCode())) { + alertLevel = ImsLevelEnum.MAJOR.getCode(); + } else if (alert.toString().equals(ImsLevelEnum.MINOR.getCode())) { + alertLevel = ImsLevelEnum.MINOR.getCode(); + } else if (alert.toString().equals(ImsLevelEnum.WARNING.getCode())) { + alertLevel = ImsLevelEnum.WARNING.getCode(); + } else { + alertLevel = ImsLevelEnum.INFO.getCode(); + } + return alertLevel; + } + + public static Map> rulePartitionByProject(List taskRuleSimples) { + Map> result = new HashMap<>(2); + for (TaskRuleSimple taskRuleSimple : taskRuleSimples) { + if (!result.containsKey(taskRuleSimple.getProjectId())) { + List tmp = new ArrayList<>(); + tmp.add(taskRuleSimple); + result.put(taskRuleSimple.getProjectId(), tmp); + } else { + result.get(taskRuleSimple.getProjectId()).add(taskRuleSimple); + } + } + + return result; + } + + public static Map> alarmConfigPartitionByProject(List alarmConfigs) { + Map> result = new HashMap<>(2); + for (TaskRuleAlarmConfig alarmConfig : alarmConfigs) { + Long projectId = alarmConfig.getTaskRuleSimple().getProjectId(); + + if (!result.containsKey(projectId)) { + List tmp = new ArrayList<>(); + tmp.add(alarmConfig); + result.put(projectId, tmp); + } else { + result.get(projectId).add(alarmConfig); + } + } + return result; + } + + public static void sendFailedMessage(Application application, List failedRules, ImsConfig imsConfig, AlarmClient 
client + , AlarmInfoDao alarmInfoDao, UserDao userDao, Integer alertRank, String alertReceiver) { + boolean bdap = imsConfig.getTitlePrefix().contains("BDAP"); + // 获取告警内容 + StringBuilder alertInfo = new StringBuilder(); + List> requestList = new ArrayList<>(failedRules.size()); + + for (TaskRuleSimple taskRuleSimpleTemp : failedRules) { + Map request = new HashMap<>(6); + // 获取告警标题 + String cnName = taskRuleSimpleTemp.getCnName(); + String enName = taskRuleSimpleTemp.getRuleName(); + String realRuleName = StringUtils.isNotEmpty(cnName) ? cnName : enName; + + String enProjectName = taskRuleSimpleTemp.getProjectName(); + String cnProjectName = taskRuleSimpleTemp.getProjectCnName(); + String realPorjectName = StringUtils.isNotEmpty(cnProjectName) ? cnProjectName : enProjectName; + String alertTitle = imsConfig.getNewTitlePrefix() + "【" + realRuleName + "】"; + alertInfo.append(getDepartAlerters(taskRuleSimpleTemp, userDao, AlertTypeEnum.TASK_FAILED.getCode(), StringUtils.isNotBlank(alertReceiver) ? alertReceiver : taskRuleSimpleTemp.getAlertReceiver())).append("\n"); + alertInfo.append("Qualitis项目: ").append(realPorjectName).append(", 技术规则: ").append(realRuleName).append(" 任务执行失败.\n"); + alertInfo.append("任务编号: ").append(application.getId()).append(". "); + if (null != application.getApplicationComment()) { + ApplicationComment applicationComment = SpringContextHolder.getBean(ApplicationCommentDao.class).getByCode(application.getApplicationComment()); + alertInfo.append("任务备注: ").append(applicationComment != null ? applicationComment.getZhMessage() : null).append(". 
"); + } + alertInfo.append("\n"); + alertInfo.append("\n也可进入 Qualitis 系统查看详情。"); + List ruleMetrics = taskRuleSimpleTemp.getTaskRuleAlarmConfigList().stream().map(TaskRuleAlarmConfig::getRuleMetric) + .filter(ruleMetric -> ruleMetric != null).collect(Collectors.toList()); + int subSystemId = QualitisConstants.SUB_SYSTEM_ID; + if (CollectionUtils.isEmpty(ruleMetrics)) { + LOGGER.info("Qualitis find project's subsystem ID or datasource's subsystem ID because there is no rule metric. Rule name: " + realRuleName); + if (null != application.getSubSystemId()) { + subSystemId = application.getSubSystemId().intValue(); + } + if (taskRuleSimpleTemp.getTask() != null && CollectionUtils.isNotEmpty(taskRuleSimpleTemp.getTask().getTaskDataSources())) { + List subSystemIds = taskRuleSimpleTemp.getTask().getTaskDataSources().stream() + .map(taskDataSource -> taskDataSource.getSubSystemId()).filter(ele -> ele != null).collect( + Collectors.toList()); + if (CollectionUtils.isNotEmpty(subSystemIds)) { + Long currentSubSystemId = subSystemIds.iterator().next(); + if (currentSubSystemId != null) { + subSystemId = currentSubSystemId.intValue(); + } + } + } + } else { + // 获取子系统 + if (ruleMetrics.iterator().next().getSubSystemId() != null) { + subSystemId = ruleMetrics.iterator().next().getSubSystemId(); + } + } + // 获取告警人 + List receivers = getReceivers(taskRuleSimpleTemp, AlertTypeEnum.TASK_FAILED.getCode(), StringUtils.isNotBlank(alertReceiver) ? alertReceiver : taskRuleSimpleTemp.getAlertReceiver()); + + // 获取告警级别 + int alertLevel = alertRank != null ? 
alertRank.intValue() : taskRuleSimpleTemp.getAlertLevel(); + // Alert object: database1[table1,table2];database2[table3,table4] + String alertObj = contructAlertObj(taskRuleSimpleTemp.getTask().getTaskDataSources()); + // 封装告警 + request.put("alert_reciver", StringUtils.join(receivers, ",")); + request.put("alert_info", alertInfo.toString()); + request.put("sub_system_id", subSystemId); + request.put("alert_title", alertTitle); + request.put("alert_level", alertLevel); + request.put("alert_obj", alertObj); + // RTX, EMAIL + request.put("alert_way", imsConfig.getAlertWay()); + request.put("ci_type_id", 55); + requestList.add(request); + + // 保存alarm_info表 + for (String username : receivers) { + AlarmInfo alarmInfo = new AlarmInfo(getAlertLevel(alertLevel), alertInfo.toString(), application.getId(), application.getSubmitTime(), + application.getFinishTime(), application.getFinishTime(), username, AlertTypeEnum.TASK_FAILED.getCode(), realPorjectName); + alarmInfoDao.save(alarmInfo); + } + + if (bdap) { + client.sendAlarm(StringUtils.join(receivers, ","), imsConfig.getTitlePrefix() + "集群 Qualitis 任务告警\n" + , alertInfo.toString(), String.valueOf(alertLevel)); + } + alertInfo.delete(0, alertInfo.length()); + } + // 发送告警 + if (CollectionUtils.isNotEmpty(requestList) && !bdap) { + client.sendNewAlarm(requestList); + } + } + + private static String contructAlertObj(Set taskDataSources) { + Map> dbAndTables = new HashMap<>(taskDataSources.size()); + for (TaskDataSource taskDataSource : taskDataSources) { + String databaseName = taskDataSource.getDatabaseName(); + String tableName = taskDataSource.getTableName(); + + if (dbAndTables.keySet().contains(databaseName)) { + dbAndTables.get(databaseName).add(tableName); + } else { + List tables = new ArrayList<>(); + tables.add(tableName); + + dbAndTables.put(databaseName, tables); + } + } + + List dbs = new ArrayList<>(dbAndTables.keySet().size()); + StringBuilder tempDb = new StringBuilder(); + for (Map.Entry> entry : 
dbAndTables.entrySet()) { + String key = entry.getKey(); + Object value = entry.getValue(); + dbs.add(tempDb.append(key).append("[").append(StringUtils.join(value, ",")).append("]").toString()); + tempDb.delete(0, tempDb.length()); + } + + return StringUtils.join(dbs, ";"); + } + + public static List getFailedTaskRule(List tasks) { + List taskRuleSimples = new ArrayList<>(); + for (Task task : tasks) { + taskRuleSimples.addAll(task.getTaskRuleSimples()); + } + + return taskRuleSimples; + } + + public static void sendInitFailedMessage(Application application, ApplicationComment applicationComment, List rules, ImsConfig imsConfig, AlarmClient client, AlarmInfoDao alarmInfoDao) { + // 获取告警标题 + String alertTitle = imsConfig.getTitlePrefix() + "集群 Qualitis 任务告警\n"; + + // 获取告警内容 + StringBuilder alertContent = new StringBuilder(); + int maxLevel = Integer.parseInt(ImsLevelEnum.INFO.getCode()); + // 获取告警接受者 + Set receivers = new HashSet<>(); + + StringBuilder alerters = new StringBuilder("请通知:"); + alerters.append("告警接收人"); + for (Rule rule : rules) { + if (StringUtils.isNotBlank(rule.getExecutionParametersName())) { + ExecutionParameters executionParameters = SpringContextHolder.getBean(ExecutionParametersDao.class).findByNameAndProjectId(rule.getExecutionParametersName(), rule.getProject().getId()); + if (executionParameters != null) { + //兼容旧规则数据 + if (StringUtils.isNotBlank(executionParameters.getAlertReceiver())) { + if (!alerters.toString().contains(executionParameters.getAlertReceiver())) { + alerters.append( + StringUtils.isNotEmpty(executionParameters.getAlertReceiver()) ? 
executionParameters.getAlertReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + } + + maxLevel = getMaxLevelAndReceivers(maxLevel, receivers, executionParameters); + continue; + } + + //0.23.0版本 + if (CollectionUtils.isNotEmpty(executionParameters.getAlarmArgumentsExecutionParameters())) { + for (AlarmArgumentsExecutionParameters parameters : executionParameters.getAlarmArgumentsExecutionParameters()) { + if (!alerters.toString().contains(parameters.getAlarmReceiver())) { + alerters.append(StringUtils.isNotEmpty(parameters.getAlarmReceiver()) ? parameters.getAlarmReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + } + // Fix: add receivers when init failed. + if (StringUtils.isNotEmpty(parameters.getAlarmReceiver())) { + receivers.addAll(Arrays.asList(parameters.getAlarmReceiver().split(SpecCharEnum.COMMA.getValue()))); + } + if (parameters.getAlarmLevel() != null && parameters.getAlarmLevel() < maxLevel) { + maxLevel = parameters.getAlarmLevel(); + } + } + + } + + } + } else { + alerters.append(StringUtils.isNotEmpty(rule.getAlertReceiver()) ? rule.getAlertReceiver() : "未设置") + .append("或创建用户") + .append(rule.getCreateUser()) + .append(";"); + // Fix: add receivers when init failed. + if (StringUtils.isNotEmpty(rule.getAlertReceiver())) { + receivers.addAll(Arrays.asList(rule.getAlertReceiver().split(SpecCharEnum.COMMA.getValue()))); + } + } + } + alerters.append("\n"); + alertContent.append(alerters); + + alertContent.append("Qualitis项目: ").append(application.getProjectName()).append(",Qualitis任务: ").append(application.getId()) + .append(" 含有技术规则: ").append("\n"); + + for (Rule rule : rules) { + alertContent.append(StringUtils.isNotEmpty(rule.getCnName()) ? 
rule.getCnName() : rule.getName()).append("\n"); + + } + + if (ApplicationStatusEnum.TASK_SUBMIT_FAILED.getCode().equals(application.getStatus())) { + alertContent.append("初始化失败,失败备注:"); + } + if (applicationComment != null) { + alertContent.append(applicationComment.getZhMessage()).append("\n").append("也可进入 Qualitis 系统查看详情。").append("\n"); + } else { + alertContent.append(application.getExceptionMessage()).append("\n").append("也可进入 Qualitis 系统查看详情。").append("\n"); + } + + Role role = SpringContextHolder.getBean(RoleDao.class).findByRoleName(ADMIN); + Set admins = SpringContextHolder.getBean(UserRoleDao.class).findByRole(role).stream().map(UserRole::getUser).map(User::getUsername).collect(Collectors.toSet()); + + receivers.addAll(admins); + + // 保存alarm_info表 + for (String username : receivers) { + AlarmInfo alarmInfo = new AlarmInfo(maxLevel + "", alertContent.toString(), application.getId(), application.getSubmitTime(), + application.getFinishTime(), application.getFinishTime(), username, AlertTypeEnum.TASK_INIT_FAIL.getCode(), application.getProjectName()); + alarmInfoDao.save(alarmInfo); + } + + // 发送告警 + if (CollectionUtils.isNotEmpty(receivers)) { + client.sendAlarm(StringUtils.join(receivers, ","), alertTitle, alertContent.toString(), maxLevel + ""); + } + } + + public static void sendInitFailedMessage(Application application, CheckAlert checkAlert, ImsConfig imsConfig, AlarmClient client, AlarmInfoDao alarmInfoDao) { + // 获取告警标题 + String alertTitle = imsConfig.getTitlePrefix() + "集群 Qualitis 任务告警\n"; + + // 获取告警内容 + StringBuilder alertContent = new StringBuilder(); + + // 获取告警接受者 + Set receivers = new HashSet<>(); + + StringBuilder alerters = new StringBuilder("请通知:"); + alerters.append("告警接收人").append(checkAlert.getCreateUser()).append(";\n"); + + alertContent.append(alerters); + alertContent.append("Qualitis项目: ").append(application.getProjectName()).append(",Qualitis任务: ").append(application.getId()).append(" 含有告警规则: 
").append("\n").append(checkAlert.toString()).append("\n"); + + alertContent.append("初始化失败,失败详情:").append(application.getExceptionMessage()).append("\n").append("也可进入 Qualitis 系统查看详情。").append("\n"); + + Role role = SpringContextHolder.getBean(RoleDao.class).findByRoleName(ADMIN); + Set admins = SpringContextHolder.getBean(UserRoleDao.class).findByRole(role).stream().map(UserRole::getUser).map(User::getUsername).collect(Collectors.toSet()); + + receivers.addAll(admins); + + // 保存alarm_info表 + for (String username : receivers) { + AlarmInfo alarmInfo = new AlarmInfo(ImsLevelEnum.WARNING.getCode(), alertContent.toString(), application.getId(), application.getSubmitTime(), application.getFinishTime(), application.getFinishTime(), username, AlertTypeEnum.TASK_INIT_FAIL.getCode(), application.getProjectName()); + alarmInfoDao.save(alarmInfo); + } + + // 发送告警 + if (CollectionUtils.isNotEmpty(receivers)) { + client.sendAlarm(StringUtils.join(receivers, ","), alertTitle, alertContent.toString(), ImsLevelEnum.WARNING.getCode()); + } + } + + private static int getMaxLevelAndReceivers(int maxLevel, Set receivers, ExecutionParameters executionParameters) { + //告警级别比对、告警人拼接 + if (executionParameters.getAlertLevel() != null && StringUtils.isNotEmpty(executionParameters.getAlertReceiver())) { + String[] receiverSet = executionParameters.getAlertReceiver().split(SpecCharEnum.COMMA.getValue()); + for (String currentReceiver : receiverSet) { + receivers.add(currentReceiver); + } + if (executionParameters.getAlertLevel() < maxLevel) { + maxLevel = executionParameters.getAlertLevel(); + } + } + return maxLevel; + } + + public static void sendAbnormalDataRecordAlarm(ImsConfig imsConfig, AlarmClient alarmClient, List> data) { + alarmClient.sendAbnormalDataRecordAlarm(imsConfig, data); + } +} diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/PassUtil.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/PassUtil.java index 4b122c4e..1dda2b42 
100644 --- a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/PassUtil.java +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/PassUtil.java @@ -16,25 +16,28 @@ package com.webank.wedatasphere.qualitis.util; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; import com.webank.wedatasphere.qualitis.dao.TaskResultDao; -import com.webank.wedatasphere.qualitis.entity.*; +import com.webank.wedatasphere.qualitis.entity.TaskResult; +import com.webank.wedatasphere.qualitis.entity.TaskRuleAlarmConfig; +import com.webank.wedatasphere.qualitis.exception.ArgumentException; import com.webank.wedatasphere.qualitis.rule.constant.CheckTemplateEnum; import com.webank.wedatasphere.qualitis.rule.constant.CompareTypeEnum; -import com.webank.wedatasphere.qualitis.submitter.impl.ExecutionManagerImpl; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; -import com.webank.wedatasphere.qualitis.entity.TaskRuleAlarmConfig; +import java.math.BigDecimal; import java.time.DayOfWeek; import java.time.LocalDate; import java.time.LocalDateTime; import java.time.LocalTime; import java.time.Month; import java.time.ZoneId; -import org.apache.commons.lang.StringUtils; -import org.slf4j.Logger; -import org.slf4j.LoggerFactory; - import java.util.Calendar; import java.util.Date; +import java.util.HashMap; +import java.util.Map; /** * @author howeye @@ -62,15 +65,33 @@ public static Boolean notSafe(String taskId, Long ruleId, TaskRuleAlarmConfig al } catch (NumberFormatException e) { return false; } - + Date nowDate = new Date(); if (checkTemplate.equals(CheckTemplateEnum.MONTH_FLUCTUATION.getCode())) { - Double monthAvg = getMonthAvg(taskResultDao, taskResult.getRuleId()); + long existValueNums = countValue(nowDate, taskResultDao, Calendar.MONTH, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + if (existValueNums <= 0) { + return true; + } + Double monthAvg = 
getMonthAvg(nowDate, taskResultDao, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + taskResult.setCompareValue(monthAvg != null ? String.valueOf(monthAvg) : ""); + taskResultDao.saveTaskResult(taskResult); return moreThanThresholds(result, monthAvg, thresholds); } else if (checkTemplate.equals(CheckTemplateEnum.WEEK_FLUCTUATION.getCode())) { - Double weekAvg = getWeekAvg(taskResultDao, taskResult.getRuleId()); + long existValueNums = countValue(nowDate, taskResultDao, Calendar.WEEK_OF_MONTH, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + if (existValueNums <= 0) { + return true; + } + Double weekAvg = getWeekAvg(nowDate, taskResultDao, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + taskResult.setCompareValue(weekAvg != null ? String.valueOf(weekAvg) : ""); + taskResultDao.saveTaskResult(taskResult); return moreThanThresholds(result, weekAvg, thresholds); } else if (checkTemplate.equals(CheckTemplateEnum.DAY_FLUCTUATION.getCode())) { - Double dayAvg = getDayAvg(taskResultDao, taskResult.getRuleId()); + long existValueNums = countValue(nowDate, taskResultDao, Calendar.DAY_OF_MONTH, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + if (existValueNums <= 0) { + return true; + } + Double dayAvg = getDayAvg(nowDate, taskResultDao, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId()); + taskResult.setCompareValue(dayAvg != null ? 
String.valueOf(dayAvg) : ""); + taskResultDao.saveTaskResult(taskResult); return moreThanThresholds(result, dayAvg, thresholds); } else if (checkTemplate.equals(CheckTemplateEnum.FIXED_VALUE.getCode())) { Integer compareType = alarmConfig.getCompareType(); @@ -80,49 +101,72 @@ public static Boolean notSafe(String taskId, Long ruleId, TaskRuleAlarmConfig al Integer compareType = alarmConfig.getCompareType(); thresholds /= 100; try { - result = getRingGrowth(taskResultDao, checkTemplate, ruleId, taskResult.getRuleMetricId()); - } catch (Exception e) { - LOGGER.info("Because the data of the previous period does not exist, the chain ratio cannot be calculated."); - return false; + Map comparedValueMap = new HashMap<>(2); + result = getRingGrowth(taskResultDao, checkTemplate, taskResult.getRuleId(), taskResult.getRuleMetricId(), taskResult.getApplicationId(), comparedValueMap); + StringBuilder computeRex = new StringBuilder(); + String avgOfLastCycle = StringUtils.isNotEmpty(comparedValueMap.get(QualitisConstants.AVG_OF_LAST_CYCLE)) ? comparedValueMap.get(QualitisConstants.AVG_OF_LAST_CYCLE) : ""; + String avgOfCurrent = StringUtils.isNotEmpty(comparedValueMap.get(QualitisConstants.AVG_OF_CURRENT)) ? 
comparedValueMap.get(QualitisConstants.AVG_OF_CURRENT) : ""; + if (StringUtils.isNotEmpty(avgOfLastCycle) && StringUtils.isNotEmpty(avgOfCurrent)) { + computeRex.append("(").append(avgOfCurrent).append(" - ").append(avgOfLastCycle).append(")").append("/").append(avgOfLastCycle); + taskResult.setCompareValue(computeRex.toString()); + taskResultDao.saveTaskResult(taskResult); + } + } catch (ArgumentException e) { + LOGGER.info("When first check ring growth, pass it."); + return true; } + if (result != null && result.equals(Double.NaN)) { + result = 0.0; + } return moreThanThresholds(result, thresholds, compareType); } } - private static Double getMonthAvg(TaskResultDao taskResultDao, Long ruleId) { - return getAvg(taskResultDao, Calendar.MONTH, ruleId); + private static Double getMonthAvg(Date nowDate, TaskResultDao taskResultDao, Long ruleId, Long ruleMetricId, String applicationId) { + return getAvg(nowDate, taskResultDao, Calendar.MONTH, ruleId, ruleMetricId, applicationId); } - private static Double getWeekAvg(TaskResultDao taskResultDao, Long ruleId) { - return getAvg(taskResultDao, Calendar.WEEK_OF_MONTH, ruleId); + private static Double getWeekAvg(Date nowDate, TaskResultDao taskResultDao, Long ruleId, Long ruleMetricId, String applicationId) { + return getAvg(nowDate, taskResultDao, Calendar.WEEK_OF_MONTH, ruleId, ruleMetricId, applicationId); } - private static Double getDayAvg(TaskResultDao taskResultDao, Long ruleId) { return getAvg(taskResultDao, Calendar.DAY_OF_MONTH, ruleId); } + private static Double getDayAvg(Date nowDate, TaskResultDao taskResultDao, Long ruleId, Long ruleMetricId, String applicationId) { + return getAvg(nowDate, taskResultDao, Calendar.DAY_OF_MONTH, ruleId, ruleMetricId, applicationId); + } - private static Double getAvg(TaskResultDao taskResultDao, Integer calendarStepUnit, Long ruleId) { - Date nowDate = new Date(); + private static Double getAvg(Date nowDate, TaskResultDao taskResultDao, Integer calendarStepUnit, Long ruleId, 
Long ruleMetricId, String applicationId) { + Calendar calendar = Calendar.getInstance(); + calendar.setTime(nowDate); + calendar.add(calendarStepUnit, -1); + Date lastMonthDate = calendar.getTime(); + + return taskResultDao.findAvgByCreateTimeBetweenAndRuleAndMetricAndApplication(QualitisConstants.PRINT_DATE_FORMAT.format(lastMonthDate), QualitisConstants.PRINT_DATE_FORMAT.format(nowDate), ruleId, ruleMetricId, applicationId); + } + + private static long countValue(Date nowDate, TaskResultDao taskResultDao, Integer calendarStepUnit, Long ruleId, Long ruleMetricId, String applicationId) { Calendar calendar = Calendar.getInstance(); calendar.setTime(nowDate); calendar.add(calendarStepUnit, -1); Date lastMonthDate = calendar.getTime(); - return taskResultDao.findAvgByCreateTimeBetweenAndRule(ExecutionManagerImpl.PRINT_TIME_FORMAT.format(lastMonthDate), ExecutionManagerImpl.PRINT_TIME_FORMAT.format(nowDate), ruleId); + return taskResultDao.countByCreateTimeBetweenAndRuleAndMetricAndApplication(QualitisConstants.PRINT_DATE_FORMAT.format(lastMonthDate), QualitisConstants.PRINT_DATE_FORMAT.format(nowDate), ruleId, ruleMetricId, applicationId); } - private static Double getRingGrowth(TaskResultDao taskResultDao, Integer ringType, Long ruleId, Long ruleMetricId) { + private static Double getRingGrowth(TaskResultDao taskResultDao, Integer ringType, Long ruleId, Long ruleMetricId, String applicationId, Map comparedValueMap) + throws ArgumentException { LocalDateTime localDateTime = LocalDateTime.now(); LocalDateTime start; LocalDateTime end; LocalDateTime startOfLast; - LocalDateTime endOfLast; + if (ringType.equals(CheckTemplateEnum.FULL_YEAR_RING_GROWTH.getCode())) { int year = localDateTime.getYear(); // Location current time area and calculate avg. 
start = LocalDateTime.of(year, 1, 1, 0, 0, 0); end = LocalDateTime.of(year, 12, 31, 23, 59, 59); startOfLast = LocalDateTime.of(year - 1, 1, 1, 0, 0, 0); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if (ringType.equals(CheckTemplateEnum.HALF_YEAR_GROWTH.getCode())) { int year = localDateTime.getYear(); @@ -137,7 +181,7 @@ private static Double getRingGrowth(TaskResultDao taskResultDao, Integer ringTyp end = LocalDateTime.of(year, 6, 30, 23, 59, 59); startOfLast = LocalDateTime.of(year - 1, 7, 1, 0, 0, 0); } - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if (ringType.equals(CheckTemplateEnum.SEASON_RING_GROWTH.getCode())) { int year = localDateTime.getYear(); Month month = localDateTime.getMonth(); @@ -147,12 +191,12 @@ private static Double getRingGrowth(TaskResultDao taskResultDao, Integer ringTyp start = LocalDateTime.of(year, firstMonthOfQuarter.getValue(), 1, 0, 0, 0); end = LocalDateTime.of(year, endMonthOfQuarter.getValue(), endMonthOfQuarter.maxLength(), 23, 59, 59); startOfLast = LocalDateTime.of(year, 1, 1, 0, 0, 0); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else { start = LocalDateTime.of(year, firstMonthOfQuarter.getValue(), 1, 0, 0, 0); end = LocalDateTime.of(year, endMonthOfQuarter.getValue(), endMonthOfQuarter.maxLength(), 23, 59, 59); startOfLast = LocalDateTime.of(year - 1, Month.OCTOBER.getValue(), 1, 0, 0, 0); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return 
specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } } else if (ringType.equals(CheckTemplateEnum.MONTH_RING_GROWTH.getCode())) { int year = localDateTime.getYear(); @@ -160,29 +204,29 @@ private static Double getRingGrowth(TaskResultDao taskResultDao, Integer ringTyp start = LocalDateTime.of(year, month.getValue(), 1, 0, 0, 0); end = LocalDateTime.of(year, month.getValue(), month.maxLength(), 23, 59, 59); startOfLast = LocalDateTime.of(year, 1, 1, 0, 0, 0); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if (ringType.equals(CheckTemplateEnum.WEEK_RING_GROWTH.getCode())) { DayOfWeek dayOfWeek = localDateTime.getDayOfWeek(); int week = dayOfWeek.getValue(); start = LocalDateTime.of(LocalDate.now(), LocalTime.MIN).minusDays(week); - end = LocalDateTime.of(LocalDate.now(), LocalTime.MAX).plusDays(DayOfWeek.SUNDAY.getValue() - week); + end = LocalDateTime.of(LocalDate.now(), LocalTime.MAX).plusDays((long) (DayOfWeek.SUNDAY.getValue() - week)); startOfLast = start.minusWeeks(DayOfWeek.MONDAY.getValue()); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if (ringType.equals(CheckTemplateEnum.DAY_RING_GROWTH.getCode())) { start = LocalDateTime.of(LocalDate.now(), LocalTime.MIN); end = LocalDateTime.of(LocalDate.now(), LocalTime.MAX); startOfLast = start.minusDays(1); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if 
(ringType.equals(CheckTemplateEnum.HOUR_RING_GROWTH.getCode())) { start = LocalDateTime.of(localDateTime.getYear(), localDateTime.getMonthValue(), localDateTime.getDayOfMonth(), localDateTime.getHour() - , 0, 0); + , 0, 0); end = start.plusHours(1); startOfLast = start.minusHours(1); - return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId); + return specialTimeRingGrowth(start, end, startOfLast, taskResultDao, ruleId, ruleMetricId, applicationId, comparedValueMap); } else if (ringType.equals(CheckTemplateEnum.YEAR_ON_YEAR.getCode())) { return yearOnYear(localDateTime, taskResultDao, ruleId, ruleMetricId); } else { - return - 1.0; + return -1.0; } } @@ -195,39 +239,70 @@ private static Double yearOnYear(LocalDateTime localDateTime, TaskResultDao task LocalDateTime startOfLast = LocalDateTime.of(year - 1, month, 1, 0, 0, 0); LocalDateTime endOfLast = LocalDateTime.of(year - 1, month.getValue(), month.maxLength(), 23, 59, 59); - Date startDate = Date.from(start.atZone( ZoneId.systemDefault()).toInstant()); - Date endDate = Date.from(end.atZone( ZoneId.systemDefault()).toInstant()); - Date startOfLastDate = Date.from(startOfLast.atZone( ZoneId.systemDefault()).toInstant()); - Date endOfLastDate = Date.from(endOfLast.atZone( ZoneId.systemDefault()).toInstant()); + Date startDate = Date.from(start.atZone(ZoneId.systemDefault()).toInstant()); + Date endDate = Date.from(end.atZone(ZoneId.systemDefault()).toInstant()); + Date startOfLastDate = Date.from(startOfLast.atZone(ZoneId.systemDefault()).toInstant()); + Date endOfLastDate = Date.from(endOfLast.atZone(ZoneId.systemDefault()).toInstant()); - Double avg = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(ExecutionManagerImpl.PRINT_TIME_FORMAT.format(startDate) - , ExecutionManagerImpl.PRINT_TIME_FORMAT.format(endDate), ruleId, ruleMetricId); + Double avg = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(QualitisConstants.PRINT_TIME_FORMAT.format(startDate) + , 
QualitisConstants.PRINT_TIME_FORMAT.format(endDate), ruleId, ruleMetricId); // Calculate pre time area. - Double avgOfLast = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(ExecutionManagerImpl.PRINT_TIME_FORMAT.format(startOfLastDate) - , ExecutionManagerImpl.PRINT_TIME_FORMAT.format(endOfLastDate), ruleId, ruleMetricId); + Double avgOfLast = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(QualitisConstants.PRINT_TIME_FORMAT.format(startOfLastDate) + , QualitisConstants.PRINT_TIME_FORMAT.format(endOfLastDate), ruleId, ruleMetricId); // Growth. LOGGER.info("Finish to get this time ring."); return (avg - avgOfLast) / avgOfLast; } private static Double specialTimeRingGrowth(LocalDateTime start, LocalDateTime end, LocalDateTime startOfLast, TaskResultDao taskResultDao, Long ruleId - , Long ruleMetricId) { + , Long ruleMetricId, String applicationId, Map comparedValueMap) throws ArgumentException { Date startDate; Date endDate; Date startOfLastDate; LOGGER.info("Start to get this time ring."); - startDate = Date.from(start.atZone( ZoneId.systemDefault()).toInstant()); - endDate = Date.from(end.atZone( ZoneId.systemDefault()).toInstant()); - startOfLastDate = Date.from(startOfLast.atZone( ZoneId.systemDefault()).toInstant()); - Double avgOfYear = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(ExecutionManagerImpl.PRINT_TIME_FORMAT.format(startDate) - , ExecutionManagerImpl.PRINT_TIME_FORMAT.format(endDate), ruleId, ruleMetricId); + startDate = Date.from(start.atZone(ZoneId.systemDefault()).toInstant()); + endDate = Date.from(end.atZone(ZoneId.systemDefault()).toInstant()); + startOfLastDate = Date.from(startOfLast.atZone(ZoneId.systemDefault()).toInstant()); + Double avgOfCurrent = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(QualitisConstants.PRINT_TIME_FORMAT.format(startDate) + , QualitisConstants.PRINT_TIME_FORMAT.format(endDate), ruleId, ruleMetricId); + if (avgOfCurrent == null) { + avgOfCurrent = 
0.0; + LOGGER.info("Avg of current is null, default to 0."); + } // Calculate pre time area. - Double avgOfLastYear = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(ExecutionManagerImpl.PRINT_TIME_FORMAT.format(startOfLastDate) - , ExecutionManagerImpl.PRINT_TIME_FORMAT.format(startDate), ruleId, ruleMetricId); - // Growth. - LOGGER.info("Finish to get this time ring."); - return (avgOfYear - avgOfLastYear) / avgOfLastYear; + // Check for the previous cycle records + long recordNumber = taskResultDao.countByCreateTimeBetweenAndRuleAndRuleMetric(QualitisConstants.PRINT_TIME_FORMAT.format(startOfLastDate) + , QualitisConstants.PRINT_TIME_FORMAT.format(startDate), ruleId, ruleMetricId, applicationId); + + if (recordNumber > 0) { + Double avgOfLastCycle = taskResultDao.findAvgByCreateTimeBetweenAndRuleAndRuleMetric(QualitisConstants.PRINT_TIME_FORMAT.format(startOfLastDate) + , QualitisConstants.PRINT_TIME_FORMAT.format(startDate), ruleId, ruleMetricId); + if (avgOfLastCycle == null) { + avgOfLastCycle = 0.0; + LOGGER.info("Avg of last cycle is null, default to 0."); + } + // Growth. 
+ LOGGER.info("Finish to get this time ring."); + comparedValueMap.put(QualitisConstants.AVG_OF_LAST_CYCLE, String.valueOf(avgOfLastCycle)); + comparedValueMap.put(QualitisConstants.AVG_OF_CURRENT, String.valueOf(avgOfCurrent)); + return (avgOfCurrent - avgOfLastCycle) / avgOfLastCycle; + } else { + long recordNumberOfPast = taskResultDao.countByCreateTimeBetweenAndRuleAndRuleMetric("2019-01-01 00:00:00", QualitisConstants.PRINT_TIME_FORMAT.format(startDate) + , ruleId, ruleMetricId, applicationId); + Double avgOfLastCycle = 0.0; + if (recordNumberOfPast > 0) { + LOGGER.info("No records in the previous cycle but not first execution."); + comparedValueMap.put(QualitisConstants.AVG_OF_LAST_CYCLE, String.valueOf(avgOfLastCycle)); + comparedValueMap.put(QualitisConstants.AVG_OF_CURRENT, String.valueOf(avgOfCurrent)); + // Make sure "avgOfLastCycle" can't be zero before doing this division (without this check, the Sonar scan reports an error) + // Original code: return (avgOfCurrent - avgOfLastCycle) / avgOfLastCycle; + return avgOfLastCycle; + } else { + LOGGER.info("No records before."); + throw new ArgumentException("No records before"); + } + } } private static Boolean moreThanThresholds(Double taskResult, Double compareValue, Double percentage) { @@ -235,6 +310,9 @@ private static Boolean moreThanThresholds(Double taskResult, Double compareValue if (taskResult == null) { return true; } else { + if (percentage < 0) { + percentage = 1 - percentage; + } Double maxPercentage = 1 + (percentage / 100); Double minPercentage = 1 - (percentage / 100); @@ -253,7 +331,7 @@ private static Boolean moreThanThresholds(Double taskResult, Double compareValue return true; } if (compareType.equals(CompareTypeEnum.EQUAL.getCode())) { - return taskResult.equals(compareValue); + return BigDecimal.valueOf(taskResult).compareTo(BigDecimal.valueOf(compareValue)) == 0; } else if (compareType.equals(CompareTypeEnum.BIGGER.getCode())) { return taskResult > compareValue; } else if (compareType.equals(CompareTypeEnum.SMALLER.getCode())) { @@ 
-263,11 +341,10 @@ private static Boolean moreThanThresholds(Double taskResult, Double compareValue } else if (compareType.equals(CompareTypeEnum.SMALLER_EQUAL.getCode())) { return taskResult <= compareValue; } else if (compareType.equals(CompareTypeEnum.NOT_EQUAL.getCode())) { - return !taskResult.equals(compareValue); + return BigDecimal.valueOf(taskResult).compareTo(BigDecimal.valueOf(compareValue)) != 0; } LOGGER.warn("Compare type is not found, {}", compareType); - return null; + return false; } - } diff --git a/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/ReportUtil.java b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/ReportUtil.java new file mode 100644 index 00000000..965b924b --- /dev/null +++ b/core/monitor/src/main/java/com/webank/wedatasphere/qualitis/util/ReportUtil.java @@ -0,0 +1,155 @@ +package com.webank.wedatasphere.qualitis.util; + +import com.webank.wedatasphere.qualitis.client.AlarmClient; +import com.webank.wedatasphere.qualitis.config.ImsConfig; +import com.webank.wedatasphere.qualitis.constant.AlarmConfigStatusEnum; +import com.webank.wedatasphere.qualitis.constant.SpecCharEnum; +import com.webank.wedatasphere.qualitis.constants.QualitisConstants; +import com.webank.wedatasphere.qualitis.dao.RuleMetricDao; +import com.webank.wedatasphere.qualitis.dao.TaskDataSourceDao; +import com.webank.wedatasphere.qualitis.dao.TaskResultDao; +import com.webank.wedatasphere.qualitis.entity.MetricData; +import com.webank.wedatasphere.qualitis.entity.ReportBatchInfo; +import com.webank.wedatasphere.qualitis.entity.RuleMetric; +import com.webank.wedatasphere.qualitis.entity.Task; +import com.webank.wedatasphere.qualitis.entity.TaskDataSource; +import com.webank.wedatasphere.qualitis.entity.TaskResult; +import com.webank.wedatasphere.qualitis.entity.TaskRuleAlarmConfig; +import com.webank.wedatasphere.qualitis.entity.TaskRuleSimple; +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import 
org.apache.commons.collections.CollectionUtils; +import org.apache.commons.lang.StringUtils; +import org.slf4j.Logger; +import org.slf4j.LoggerFactory; + +import java.util.ArrayList; +import java.util.List; +import java.util.stream.Collectors; + +/** + * @author allenzhou + */ +public class ReportUtil { + private static final String ABNORMAL_VALUE = "abnormal_value: "; + private static final String RULE_METRIC_VALUE = "rule_metric_value: "; + + private static final Logger LOGGER = LoggerFactory.getLogger("monitor"); + + private ReportUtil() { + } + + /** + * Get all passed rule metrics. + * @param tasks + * @param ruleMetricDao + * @param imsConfig + * @return + */ + public static List collectTaskResult(List tasks, TaskResultDao taskResultDao, TaskDataSourceDao taskDataSourceDao, + RuleMetricDao ruleMetricDao, ImsConfig imsConfig) { + List allReportBatchInfo = new ArrayList<>(tasks.size()); + for (Task task : tasks) { + for (TaskRuleSimple taskRuleSimple : task.getTaskRuleSimples()) { + long collectTimestamp = System.currentTimeMillis(); + + // Rule attrs + String ruleName = StringUtils.isNotBlank(taskRuleSimple.getCnName()) ? taskRuleSimple.getCnName() : taskRuleSimple.getRuleName(); + + ReportBatchInfo ruleMetricReportBatchInfo = new ReportBatchInfo(); + ruleMetricReportBatchInfo.setUserAuthKey(imsConfig.getUserAuthKey()); + List metricDatas = new ArrayList<>(); + + ReportBatchInfo abnormalReportBatchInfo = new ReportBatchInfo(); + abnormalReportBatchInfo.setUserAuthKey(imsConfig.getUserAuthKey()); + List abnormalMetricDatas = new ArrayList<>(); + + // Related datasources: metric values and abnormal values + List taskDataSources = taskDataSourceDao.findByTaskAndRuleId(task, taskRuleSimple.getRuleId()); + + // Task result. 
+ for (TaskRuleAlarmConfig taskRuleAlarmConfig : taskRuleSimple.getTaskRuleAlarmConfigList()) { + // According to the upload switch, save the metric values that need to be reported + if (taskRuleAlarmConfig.getStatus().equals(AlarmConfigStatusEnum.PASS.getCode()) + && taskRuleAlarmConfig.getUploadRuleMetricValue() != null && taskRuleAlarmConfig.getUploadRuleMetricValue() && + taskRuleAlarmConfig.getRuleMetric() != null) { + Long ruleMetricId = taskRuleAlarmConfig.getRuleMetric().getId(); + List taskResults = taskResultDao.findByApplicationAndRule(taskRuleSimple.getApplicationId() + , taskRuleSimple.getRuleId()).stream().filter(tr -> tr.getRuleMetricId().equals(ruleMetricId)) + .collect(Collectors.toList()); + if (CollectionUtils.isNotEmpty(taskResults)) { + for (TaskResult taskResult : taskResults) { + RuleMetric ruleMetric = ruleMetricDao.findById(taskResult.getRuleMetricId()); + metricDatas + .add(constructMetaData(ruleMetric, taskResult, imsConfig, ruleName, collectTimestamp, taskDataSources)); + } + } + } + // According to the abnormal-data upload switch, save the abnormal values that need to be reported + if (taskRuleAlarmConfig.getStatus().equals(AlarmConfigStatusEnum.NOT_PASS.getCode()) + && taskRuleAlarmConfig.getUploadAbnormalValue() != null && taskRuleAlarmConfig.getUploadAbnormalValue() + && taskRuleAlarmConfig.getRuleMetric() != null) { + Long ruleMetricId = taskRuleAlarmConfig.getRuleMetric().getId(); + List taskResults = taskResultDao.findByApplicationAndRule(taskRuleSimple.getApplicationId() + , taskRuleSimple.getRuleId()).stream().filter(tr -> tr.getRuleMetricId().equals(ruleMetricId)).collect(Collectors.toList()); + if (CollectionUtils.isNotEmpty(taskResults)) { + for (TaskResult taskResult : taskResults) { + RuleMetric ruleMetric = ruleMetricDao.findById(taskResult.getRuleMetricId()); + abnormalMetricDatas.add(constructMetaData(ruleMetric, taskResult, imsConfig, ruleName, collectTimestamp, taskDataSources)); + } + } + } + + } + if (CollectionUtils.isNotEmpty(metricDatas)) { + ruleMetricReportBatchInfo.setMetricDataList(metricDatas); + allReportBatchInfo.add(ruleMetricReportBatchInfo); + } + if 
(CollectionUtils.isNotEmpty(abnormalMetricDatas)) { + abnormalReportBatchInfo.setMetricDataList(abnormalMetricDatas); + allReportBatchInfo.add(abnormalReportBatchInfo); + } + + } + } + return allReportBatchInfo; + } + + private static MetricData constructMetaData(RuleMetric ruleMetric, TaskResult taskResult, ImsConfig imsConfig, String ruleName, + long collectTimestamp, List taskDataSources) { + MetricData metricData = new MetricData(); + metricData.setMetricValue(StringUtils.isBlank(taskResult.getValue()) ? "0" : taskResult.getValue()); + metricData.setHostIp(QualitisConstants.QUALITIS_SERVER_HOST); + if (ruleMetric.getSubSystemId() != null) { + metricData.setSubsystemId(String.valueOf(ruleMetric.getSubSystemId())); + } else { + metricData.setSubsystemId(imsConfig.getSystemId()); + } + metricData.setInterfaceName(spliceDatabaseAndTable(taskDataSources)); + metricData.setAttrGroup(ruleName); + metricData.setAttrName(ruleMetric.getName() + (StringUtils.isNotEmpty(taskResult.getEnvName()) ? taskResult.getEnvName() : "")); + metricData.setCollectTimestamp(String.valueOf(collectTimestamp)); + return metricData; + } + + private static String spliceDatabaseAndTable(List taskDataSources) { + StringBuilder databaseAndTable = new StringBuilder(); + for (TaskDataSource taskDataSource : taskDataSources) { + if (StringUtils.isNotEmpty(taskDataSource.getDatabaseName())) { + databaseAndTable.append(taskDataSource.getDatabaseName()).append(SpecCharEnum.BOTTOM_BAR.getValue()); + } + if (StringUtils.isNotEmpty(taskDataSource.getTableName())) { + databaseAndTable.append(taskDataSource.getTableName()).append(SpecCharEnum.BOTTOM_BAR.getValue()); + } + if (StringUtils.isNotEmpty(databaseAndTable.toString())) { + break; + } + } + return databaseAndTable.length() > 0 ? 
databaseAndTable.substring(0, databaseAndTable.length() - 1) : ""; + } + + public static void reportTaskResult(ReportBatchInfo reportBatchInfos, AlarmClient alarmClient) throws UnExpectedRequestException { + if (CollectionUtils.isEmpty(reportBatchInfos.getMetricDataList())) { + throw new UnExpectedRequestException("Report metric data {&CAN_NOT_BE_NULL_OR_EMPTY}"); + } + alarmClient.report(reportBatchInfos); + } +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/CheckAlertDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/CheckAlertDao.java new file mode 100644 index 00000000..d24e1a1a --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/CheckAlertDao.java @@ -0,0 +1,139 @@ +package com.webank.wedatasphere.qualitis.checkalert.dao; + +import com.webank.wedatasphere.qualitis.checkalert.entity.CheckAlert; +import com.webank.wedatasphere.qualitis.project.entity.Project; +import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; +import org.springframework.data.domain.Page; + +import java.util.List; +import java.util.Map; + +/** + * @author allenzhou@webank.com + * @date 2023/3/1 15:40 + */ +public interface CheckAlertDao { + /** + * Save + * @param checkAlert + * @return + */ + CheckAlert save(CheckAlert checkAlert); + + /** + * find by id + * @param checkAlertId + * @return + */ + CheckAlert findById(Long checkAlertId); + + /** + * delete + * @param checkAlert + */ + void delete(CheckAlert checkAlert); + + /** + * delete all + * @param checkAlerts + */ + void deleteAll(List checkAlerts); + + /** + * find by topic, workflow info + * @param topic + * @param workFlowName + * @param workFlowVersion + * @param workFlowSpace + * @param projectId + * @return + */ + CheckAlert findByTopicAndWorkflowInfo(String topic, String workFlowName, String workFlowVersion, String workFlowSpace, Long projectId); + + /** + * find by rule group + * @param ruleGroupInDb + 
* @return + */ + List findByRuleGroup(RuleGroup ruleGroupInDb); + + /** + * find by project and workflow and topic + * @param project + * @param workflowName + * @param topic + * @return + */ + CheckAlert findByProjectAndWorkflowNameAndTopic(Project project, String workflowName, String topic); + + /** + * find by project and topic with lowest version + * @param projectId + * @param topic + * @return + */ + CheckAlert findLowestVersionByProjectAndTopic(Long projectId, String topic); + + /** + * count by project and topic + * @param projectId + * @param topic + * @return + */ + int countByProjectAndTopic(Long projectId, String topic); + + /** + * find by project and workflow and topics + * @param project + * @param workflowName + * @param topics + * @return + */ + List findByProjectAndWorkflowNameAndTopics(Project project, String workflowName, List topics); + + /** + * find by topic, workflow name, version and project + * @param topic + * @param workFlowName + * @param version + * @param projectId + * @return + */ + Long selectMateCheckAlert(String topic, String workFlowName, String version, Long projectId); + + /** + * get Deduplication Field + * @return + * @param projectId + */ + List> getDeduplicationField(Long projectId); + + /** + * check Alert Query + * @param topic + * @param alertTable + * @param workFlowSpace + * @param workFlowProject + * @param workFlowName + * @param nodeName + * @param createUser + * @param modifyUser + * @param startCreateTime + * @param endCreateTime + * @param startModifyTime + * @param endModifyTime + * @param projectId + * @param page + * @param size + * @return + */ + Page checkAlertQuery(String topic, String alertTable, String workFlowSpace, String workFlowProject, String workFlowName, String nodeName + , String createUser, String modifyUser, String startCreateTime, String endCreateTime, String startModifyTime, String endModifyTime, Long projectId, int page, int size); + + /** + * Save all + * @param checkAlerts + * @return + */ 
+ List saveAll(List checkAlerts); +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/impl/CheckAlertDaoImpl.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/impl/CheckAlertDaoImpl.java new file mode 100644 index 00000000..f090ee14 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/impl/CheckAlertDaoImpl.java @@ -0,0 +1,105 @@ +package com.webank.wedatasphere.qualitis.checkalert.dao.impl; + +import com.webank.wedatasphere.qualitis.checkalert.dao.CheckAlertDao; +import com.webank.wedatasphere.qualitis.checkalert.dao.repository.CheckAlertRepository; +import com.webank.wedatasphere.qualitis.checkalert.entity.CheckAlert; +import com.webank.wedatasphere.qualitis.project.entity.Project; +import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; +import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageRequest; +import org.springframework.data.domain.Pageable; +import org.springframework.data.domain.Sort; +import org.springframework.stereotype.Repository; + +import java.util.List; +import java.util.Map; + +/** + * @author allenzhou@webank.com + * @date 2023/3/1 15:43 + */ +@Repository +public class CheckAlertDaoImpl implements CheckAlertDao { + + @Autowired + private CheckAlertRepository checkAlertRepository; + + @Override + public CheckAlert save(CheckAlert checkAlert) { + return checkAlertRepository.save(checkAlert); + } + + @Override + public CheckAlert findById(Long checkAlertId) { + return checkAlertRepository.findById(checkAlertId).orElse(null); + } + + @Override + public void delete(CheckAlert checkAlert) { + checkAlertRepository.delete(checkAlert); + } + + @Override + public void deleteAll(List checkAlerts) { + checkAlertRepository.deleteAll(checkAlerts); + } + 
@Override + public CheckAlert findByTopicAndWorkflowInfo(String topic, String workFlowName, String workFlowVersion, String workFlowSpace, Long projectId) { + return checkAlertRepository.findByTopicAndWorkflowInfo(topic, workFlowName, workFlowVersion, workFlowSpace, projectId); + } + + @Override + public List findByRuleGroup(RuleGroup ruleGroupInDb) { + return checkAlertRepository.findByRuleGroup(ruleGroupInDb); + } + + @Override + public CheckAlert findByProjectAndWorkflowNameAndTopic(Project project, String workflowName, String topic) { + return checkAlertRepository.findByProjectAndWorkflowNameAndTopic(project.getId(), workflowName, topic); + } + + @Override + public CheckAlert findLowestVersionByProjectAndTopic(Long projectId, String topic) { + return checkAlertRepository.findLowestVersionByProjectAndTopic(projectId, topic); + } + + @Override + public int countByProjectAndTopic(Long projectId, String topic) { + return checkAlertRepository.countByProjectAndTopic(projectId, topic); + } + + @Override + public List findByProjectAndWorkflowNameAndTopics(Project project, String workflowName, List topics) { + return checkAlertRepository.findByProjectAndWorkflowNameAndTopics(project.getId(), workflowName, topics); + } + + @Override + public Long selectMateCheckAlert(String topic, String workFlowName, String version, Long projectId) { + return checkAlertRepository.selectMateCheckAlert(topic, workFlowName, version, projectId); + } + + @Override + public List> getDeduplicationField(Long projectId) { + return checkAlertRepository.getDeduplicationField(projectId); + } + + @Override + public Page checkAlertQuery(String topic, String alertTable, String workFlowSpace, String workFlowProject, String workFlowName, + String nodeName, String createUser, String modifyUser, String startCreateTime, String endCreateTime, String startModifyTime, String endModifyTime, Long projectId, int page, int size) { + Sort sort = Sort.by(Sort.Direction.DESC, "id"); + Pageable pageable = 
PageRequest.of(page, size, sort); + return checkAlertRepository.checkAlertQuery(topic, alertTable, workFlowSpace, workFlowProject, workFlowName, + nodeName, createUser, modifyUser, startCreateTime, endCreateTime, startModifyTime, endModifyTime, projectId, pageable); + } + + @Override + public List saveAll(List checkAlerts) { + return checkAlertRepository.saveAll(checkAlerts); + } +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertRepository.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertRepository.java new file mode 100644 index 00000000..82d1b73a --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertRepository.java @@ -0,0 +1,166 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package com.webank.wedatasphere.qualitis.checkalert.dao.repository; + +import com.webank.wedatasphere.qualitis.checkalert.entity.CheckAlert; +import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.Pageable; +import org.springframework.data.jpa.repository.JpaRepository; +import org.springframework.data.jpa.repository.Query; + +import java.util.List; +import java.util.Map; + +/** + * @author allenzhou + */ +public interface CheckAlertRepository extends JpaRepository { + + /** + * find by topic, workflow info + * @param topic + * @param workFlowName + * @param workFlowVersion + * @param workFlowSpace + * @param projectId + * @return + */ + @Query(value = "select qac.* from qualitis_alert_config qac where qac.topic = ?1 and qac.work_flow_name = ?2 and qac.work_flow_version = ?3 and qac.work_flow_space = ?4 and qac.project_id = ?5", nativeQuery = true) + CheckAlert findByTopicAndWorkflowInfo(String topic, String workFlowName, String workFlowVersion, String workFlowSpace, Long projectId); + + /** + * find by rule group + * @param ruleGroupInDb + * @return + */ + List findByRuleGroup(RuleGroup ruleGroupInDb); + + /** + * find by project and workflow and topic + * @param id + * @param workflowName + * @param topic + * @return + */ + @Query(value = "select qac.* from qualitis_alert_config qac where qac.project_id = ?1 and qac.work_flow_name = ?2 and qac.topic = ?3", nativeQuery = true) + CheckAlert findByProjectAndWorkflowNameAndTopic(Long id, String workflowName, String topic); + + /** + * find by project and topic with lowest version + * @param projectId + * @param topic + * @return + */ + @Query(value = "select qac.*, 0+RIGHT(work_flow_version, 6) AS workFlowVersion from qualitis_alert_config qac where qac.project_id = ?1 and qac.topic = ?2 ORDER BY workFlowVersion ASC limit 1", nativeQuery = true) + CheckAlert findLowestVersionByProjectAndTopic(Long projectId, String 
topic); + + /** + * count by project and topic + * @param projectId + * @param topic + * @return + */ + @Query(value = "select count(qac.id) from qualitis_alert_config qac where qac.project_id = ?1 and qac.topic = ?2", nativeQuery = true) + int countByProjectAndTopic(Long projectId, String topic); + + /** + * find by project and workflow and topics + * @param projectId + * @param workflowName + * @param topics + * @return + */ + @Query(value = "select qac.* from qualitis_alert_config qac where qac.project_id = ?1 and qac.work_flow_name = ?2 and qac.topic in ?3", nativeQuery = true) + List findByProjectAndWorkflowNameAndTopics(Long projectId, String workflowName, List topics); + + /** + * find by topic, workflow name, version and project + * @param topic + * @param workFlowName + * @param version + * @param projectId + * @return + */ + @Query(value = "select qac.id from qualitis_alert_config qac where qac.topic = ?1 and qac.work_flow_name = ?2 and qac.work_flow_version = ?3 and qac.project_id = ?4 ", nativeQuery = true) + Long selectMateCheckAlert(String topic, String workFlowName, String version, Long projectId); + + + /** + * check Alert Query + * + * @param topic + * @param alertTable + * @param workFlowSpace + * @param workFlowProject + * @param workFlowName + * @param nodeName + * @param createUser + * @param modifyUser + * @param startCreateTime + * @param endCreateTime + * @param startModifyTime + * @param endModifyTime + * @param projectId + * @param pageable + * @return + */ + @Query(value = "SELECT qr.* FROM qualitis_alert_config qr,qualitis_project qt where qr.project_id=qt.id and qr.project_id=?13 " + + "and if(?1 is null, 1=1, qr.topic like ?1) and if(?2 is null, 1=1, qr.alert_table = ?2) and if(?3 is null,1=1, qr.work_flow_space = ?3) " + + "and if(?4 is null,1=1, qt.name = ?4) and if(?5 is null,1=1,qr.work_flow_name = ?5) " + + "and if(?6 is null,1=1, qr.node_name = ?6) " + + "and if(?7 is null,1=1,qr.create_user = ?7) and if(?8 is 
null,1=1,qr.modify_user = ?8) " + + "and if(?9 is null,1=1,qr.create_time >= ?9) and if(?10 is null,1=1,qr.create_time <= ?10) " + + "and if(?11 is null,1=1,qr.modify_time >= ?11) and if(?12 is null,1=1,qr.modify_time <= ?12) " + + "group by qr.id" + , countQuery = "SELECT COUNT(0) FROM (SELECT qr.* FROM qualitis_alert_config qr,qualitis_project qt where qr.project_id=qt.id and qr.project_id=?13 " + + "and if(?1 is null, 1=1, qr.topic like ?1) and if(?2 is null, 1=1, qr.alert_table = ?2) and if(?3 is null,1=1,qr.work_flow_space = ?3) " + + "and if(?4 is null,1=1, qt.name = ?4) and if(?5 is null,1=1,qr.work_flow_name = ?5) " + + "and if(?6 is null,1=1, qr.node_name = ?6) " + + "and if(?7 is null,1=1,qr.create_user = ?7) and if(?8 is null,1=1,qr.modify_user = ?8) " + + "and if(?9 is null,1=1,qr.create_time >= ?9) and if(?10 is null,1=1,qr.create_time <= ?10) " + + "and if(?11 is null,1=1,qr.modify_time >= ?11) and if(?12 is null,1=1,qr.modify_time <= ?12) " + + "group by qr.id) as a" + , nativeQuery = true) + Page checkAlertQuery(String topic, String alertTable, String workFlowSpace, String workFlowProject, + String workFlowName, String nodeName, String createUser, String modifyUser, String startCreateTime, + String endCreateTime, String startModifyTime, String endModifyTime, Long projectId, Pageable pageable); + + + + /** + * get Deduplication Field + * + * @param projectId + * @return + */ + @Query(value ="select DISTINCT qa.work_flow_space as workFlowSpace,'' as workFlowProject,''as workFlowName,'' as nodeName,'' as alertTable from " + + " qualitis_alert_config qa,qualitis_project qp where qa.project_id =qp.id and project_id =?1 and qa.work_flow_space is not null " + + " union " + + " select DISTINCT '' as workFlowSpace,qp.name as workFlowProject,''as workFlowName,'' as nodeName,'' as alertTable from " + + " qualitis_alert_config qa,qualitis_project qp where qa.project_id =qp.id and project_id =?1 and qp.name is not null " + + " union " + + " select DISTINCT '' as 
workFlowSpace,'' as workFlowProject,qa.work_flow_name as workFlowName,'' as nodeName,'' as alertTable from " +
+        " qualitis_alert_config qa,qualitis_project qp where qa.project_id =qp.id and project_id =?1 and qa.work_flow_name is not null " +
+        " union " +
+        " select DISTINCT '' as workFlowSpace,'' as workFlowProject,'' as workFlowName,qa.node_name as nodeName,'' as alertTable from " +
+        " qualitis_alert_config qa,qualitis_project qp where qa.project_id =qp.id and project_id =?1 and qa.node_name is not null " +
+        " union " +
+        " select DISTINCT '' as workFlowSpace,'' as workFlowProject,'' as workFlowName,'' as nodeName,qa.alert_table as alertTable from " +
+        " qualitis_alert_config qa,qualitis_project qp where qa.project_id =qp.id and project_id =?1 and qa.alert_table is not null ",nativeQuery = true)
+    List<Map<String, Object>> getDeduplicationField(Long projectId);
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertWhiteListRepository.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertWhiteListRepository.java
new file mode 100644
index 00000000..8d9f03fb
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/dao/repository/CheckAlertWhiteListRepository.java
@@ -0,0 +1,37 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *      http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */ + +package com.webank.wedatasphere.qualitis.checkalert.dao.repository; + +import com.webank.wedatasphere.qualitis.checkalert.entity.CheckAlertWhiteList; +import org.springframework.data.jpa.repository.JpaRepository; +import org.springframework.data.jpa.repository.Query; + +/** + * @author allenzhou + */ +public interface CheckAlertWhiteListRepository extends JpaRepository { + + /** + * Check white list + * @param item + * @param type + * @param user + * @return + */ + @Query(value = "SELECT qaw.* FROM qualitis_alert_whitelist qaw WHERE qaw.item = ?1 AND qaw.type = ?2 AND qaw.authorized_user = ?3", nativeQuery = true) + CheckAlertWhiteList checkWhiteList(String item, Integer type, String user); +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlert.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlert.java new file mode 100644 index 00000000..925879e8 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlert.java @@ -0,0 +1,272 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */ + +package com.webank.wedatasphere.qualitis.checkalert.entity; + +import com.fasterxml.jackson.annotation.JsonIdentityInfo; +import com.fasterxml.jackson.annotation.ObjectIdGenerators; +import com.webank.wedatasphere.qualitis.project.entity.Project; +import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; +import org.codehaus.jackson.annotate.JsonIgnore; + +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.GeneratedValue; +import javax.persistence.GenerationType; +import javax.persistence.Id; +import javax.persistence.ManyToOne; +import javax.persistence.Table; + +/** + * @author allenzhou + */ +@Entity +@JsonIdentityInfo(generator = ObjectIdGenerators.IntSequenceGenerator.class, property = "@id") +@Table(name = "qualitis_alert_config") +public class CheckAlert { + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + private Long id; + + private String topic; + + @Column(name = "info_receiver") + private String infoReceiver; + @Column(name = "major_receiver") + private String majorReceiver; + + @Column(name = "alert_table") + private String alertTable; + + private String filter; + + @Column(name = "alert_col") + private String alertCol; + @Column(name = "major_alert_col") + private String majorAlertCol; + + @Column(name = "content_cols") + private String contentCols; + + @Column(name = "create_user", length = 50) + private String createUser; + @Column(name = "create_time", length = 25) + private String createTime; + @Column(name = "modify_user", length = 50) + private String modifyUser; + @Column(name = "modify_time", length = 25) + private String modifyTime; + + @Column(name = "node_name") + private String nodeName; + + @Column(name = "work_flow_name") + private String workFlowName; + + @Column(name = "work_flow_version") + private String workFlowVersion; + + @Column(name = "work_flow_space") + private String workFlowSpace; + + @ManyToOne + @JsonIgnore + private Project project; + + @ManyToOne + 
@JsonIgnore + private RuleGroup ruleGroup; + + public CheckAlert() { + // Do nothing. + } + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getTopic() { + return topic; + } + + public void setTopic(String topic) { + this.topic = topic; + } + + public String getInfoReceiver() { + return infoReceiver; + } + + public void setInfoReceiver(String infoReceiver) { + this.infoReceiver = infoReceiver; + } + + public String getMajorReceiver() { + return majorReceiver; + } + + public void setMajorReceiver(String majorReceiver) { + this.majorReceiver = majorReceiver; + } + + public String getAlertTable() { + return alertTable; + } + + public void setAlertTable(String alertTable) { + this.alertTable = alertTable; + } + + public String getFilter() { + return filter; + } + + public void setFilter(String filter) { + this.filter = filter; + } + + public String getAlertCol() { + return alertCol; + } + + public void setAlertCol(String alertCol) { + this.alertCol = alertCol; + } + + public String getMajorAlertCol() { + return majorAlertCol; + } + + public void setMajorAlertCol(String majorAlertCol) { + this.majorAlertCol = majorAlertCol; + } + + public String getContentCols() { + return contentCols; + } + + public void setContentCols(String contentCols) { + this.contentCols = contentCols; + } + + public String getCreateUser() { + return createUser; + } + + public void setCreateUser(String createUser) { + this.createUser = createUser; + } + + public String getCreateTime() { + return createTime; + } + + public void setCreateTime(String createTime) { + this.createTime = createTime; + } + + public String getModifyUser() { + return modifyUser; + } + + public void setModifyUser(String modifyUser) { + this.modifyUser = modifyUser; + } + + public String getModifyTime() { + return modifyTime; + } + + public void setModifyTime(String modifyTime) { + this.modifyTime = modifyTime; + } + + public String getNodeName() { + return 
nodeName; + } + + public void setNodeName(String nodeName) { + this.nodeName = nodeName; + } + + public String getWorkFlowName() { + return workFlowName; + } + + public void setWorkFlowName(String workFlowName) { + this.workFlowName = workFlowName; + } + + public String getWorkFlowVersion() { + return workFlowVersion; + } + + public void setWorkFlowVersion(String workFlowVersion) { + this.workFlowVersion = workFlowVersion; + } + + public Project getProject() { + return project; + } + + public void setProject(Project project) { + this.project = project; + } + + public RuleGroup getRuleGroup() { + return ruleGroup; + } + + public void setRuleGroup(RuleGroup ruleGroup) { + this.ruleGroup = ruleGroup; + } + + public String getWorkFlowSpace() { + return workFlowSpace; + } + + public void setWorkFlowSpace(String workFlowSpace) { + this.workFlowSpace = workFlowSpace; + } + + @Override + public String toString() { + return "CheckAlert{" + + "id=" + id + + ", topic='" + topic + '\'' + + ", infoReceiver='" + infoReceiver + '\'' + + ", majorReceiver='" + majorReceiver + '\'' + + ", alertTable='" + alertTable + '\'' + + ", filter='" + filter + '\'' + + ", alertCol='" + alertCol + '\'' + + ", majorAlertCol='" + majorAlertCol + '\'' + + ", contentCols='" + contentCols + '\'' + + ", createUser='" + createUser + '\'' + + ", createTime='" + createTime + '\'' + + ", modifyUser='" + modifyUser + '\'' + + ", modifyTime='" + modifyTime + '\'' + + ", nodeName='" + nodeName + '\'' + + ", workFlowName='" + workFlowName + '\'' + + ", workFlowVersion='" + workFlowVersion + '\'' + + ", project=" + project + + ", ruleGroup=" + ruleGroup + + '}'; + } +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlertWhiteList.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlertWhiteList.java new file mode 100644 index 00000000..0b4f1afd --- /dev/null +++ 
b/core/project/src/main/java/com/webank/wedatasphere/qualitis/checkalert/entity/CheckAlertWhiteList.java @@ -0,0 +1,112 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package com.webank.wedatasphere.qualitis.checkalert.entity; + +import com.fasterxml.jackson.annotation.JsonIdentityInfo; +import com.fasterxml.jackson.annotation.ObjectIdGenerators; +import javax.persistence.Column; +import javax.persistence.Entity; +import javax.persistence.GeneratedValue; +import javax.persistence.GenerationType; +import javax.persistence.Table; +import javax.persistence.Id; + +/** + * @author allenzhou + */ +@Entity +@JsonIdentityInfo(generator = ObjectIdGenerators.IntSequenceGenerator.class, property = "@id") +@Table(name = "qualitis_alert_whitelist") +public class CheckAlertWhiteList { + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + private Long id; + @Column(name = "item") + private String item; + @Column(name = "type") + private Integer type; + @Column(name = "authorized_user") + private String authorizedUser; + @Column(name = "create_time") + private String create_time; + @Column(name = "modify_time") + private String modify_time; + + public CheckAlertWhiteList() { + // Default do nothing. 
+ } + + public Long getId() { + return id; + } + + public void setId(Long id) { + this.id = id; + } + + public String getItem() { + return item; + } + + public void setItem(String item) { + this.item = item; + } + + public Integer getType() { + return type; + } + + public void setType(Integer type) { + this.type = type; + } + + public String getAuthorizedUser() { + return authorizedUser; + } + + public void setAuthorizedUser(String authorizedUser) { + this.authorizedUser = authorizedUser; + } + + public String getCreate_time() { + return create_time; + } + + public void setCreate_time(String create_time) { + this.create_time = create_time; + } + + public String getModify_time() { + return modify_time; + } + + public void setModify_time(String modify_time) { + this.modify_time = modify_time; + } + + @Override + public String toString() { + return "CheckAlertWhiteList{" + + "id=" + id + + ", item='" + item + '\'' + + ", type=" + type + + ", authorizedUser='" + authorizedUser + '\'' + + ", create_time='" + create_time + '\'' + + ", modify_time='" + modify_time + '\'' + + '}'; + } +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/OperateTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/OperateTypeEnum.java new file mode 100644 index 00000000..a4af580d --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/OperateTypeEnum.java @@ -0,0 +1,81 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.webank.wedatasphere.qualitis.project.constant;
+
+/**
+ * @author allenzhou
+ */
+public enum OperateTypeEnum {
+    /**
+     * Type of operation on projects and rules
+     */
+    UNAUTHORIZE_PROJECT(12, "Unauthorized Project", "取消授权项目"),
+    AUTHORIZE_PROJECT(11, "Authorized Project", "授权项目"),
+    SUBMIT_PROJECT(10, "Submit Project", "提交项目"),
+    CREATE_PROJECT(1, "Create Project", "创建项目"),
+    IMPORT_PROJECT(2, "Import Project", "导入项目"),
+    EXPORT_PROJECT(3, "Export Project", "导出项目"),
+    DELETE_PROJECT(4, "Delete Project", "删除项目"),
+    MODIFY_PROJECT(5, "Modify Project", "修改项目"),
+    CREATE_RULES(6, "Create Rules", "创建规则"),
+    MODIFY_RULES(7, "Modify Rules", "修改规则"),
+    DELETE_RULES(8, "Delete Rules", "删除规则"),
+    ;
+
+    private Integer code;
+    private String message;
+    private String name;
+
+    OperateTypeEnum(Integer code, String message, String name) {
+        this.code = code;
+        this.message = message;
+        this.name = name;
+    }
+
+    public static OperateTypeEnum fromCode(Integer code) {
+        for (OperateTypeEnum operateTypeEnum : OperateTypeEnum.values()) {
+            if (operateTypeEnum.getCode().equals(code)) {
+                return operateTypeEnum;
+            }
+        }
+        return null;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public void setCode(Integer code) {
+        this.code = code;
+    }
+
+    public String getName() {
+        return name;
+    }
+
+    public void setName(String name) {
+        this.name = name;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public void setMessage(String message) {
+        this.message = message;
+    }
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/SwitchTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/SwitchTypeEnum.java
new file mode 100644
index 00000000..1c534c24
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/constant/SwitchTypeEnum.java
@@ -0,0 +1,51 @@
+package com.webank.wedatasphere.qualitis.project.constant;
+
+import com.google.common.collect.Maps;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum SwitchTypeEnum {
+
+    /**
+     * Switch Type
+     */
+    AUTO_MATIC(true, "自动"),
+    HAND_MOVEMENT(false, "手动"),
+    ;
+
+    private Boolean code;
+    private String message;
+
+    SwitchTypeEnum(Boolean code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Boolean getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static List<Map<String, Object>> getSwitchTypeEnumList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (SwitchTypeEnum switchTypeEnum : SwitchTypeEnum.values()) {
+            Map<String, Object> item = Maps.newHashMap();
+            item.put("code", switchTypeEnum.code);
+            item.put("message", switchTypeEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+
+
+}
diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java
similarity index 89%
rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java
rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java
index 7e073a55..a8bf3cb2 100644
--- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectDao.java
@@ -55,6 +55,12 @@ public interface ProjectDao {
      */
     List<Project> findAllProject(int page, int size);
 
+    /**
+     * Get all projects
+     * @return
+     */
+    List<Project> findAll();
+
     /**
      * Count all projects
      * @return
@@ -80,4 +86,11 @@ public interface ProjectDao {
      * @return
      */
     List<Project> findByCreateUser(String createUser);
+
+    /**
+     * Find all projects by id
+     * @param projectIds
+     * @return
+     */
+    List<Project> findAllById(List<Long> projectIds);
 }
diff --git 
a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java similarity index 56% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java index 1da3493b..34f7a73f 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectEventDao.java @@ -1,7 +1,8 @@ package com.webank.wedatasphere.qualitis.project.dao; -import com.webank.wedatasphere.qualitis.project.entity.Project; import com.webank.wedatasphere.qualitis.project.entity.ProjectEvent; +import org.springframework.data.domain.Page; + import java.util.List; /** @@ -17,20 +18,18 @@ public interface ProjectEventDao { ProjectEvent save(ProjectEvent projectEvent); /** - * Paging find by project. - * @param page - * @param size - * @param project - * @param typeId + * Save Batch. + * @param projectEvents * @return */ - List find(int page, int size, Project project, Integer typeId); + void saveBatch(List projectEvents); /** - * Count by project. 
- * @param project - * @param typeId + * find With Page + * @param page + * @param size + * @param projectId * @return */ - long count(Project project, Integer typeId); + Page findWithPage(int page, int size, Long projectId); } diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectLabelDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectLabelDao.java similarity index 100% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectLabelDao.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectLabelDao.java diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java similarity index 53% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java index 7b3c1e54..385d9e4b 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/ProjectUserDao.java @@ -18,9 +18,12 @@ import com.webank.wedatasphere.qualitis.project.entity.Project; import com.webank.wedatasphere.qualitis.project.entity.ProjectUser; -import com.webank.wedatasphere.qualitis.query.queryqo.DataSourceQo; +import com.webank.wedatasphere.qualitis.project.queryqo.DataSourceQo; +import org.springframework.data.domain.Page; import java.util.List; +import java.util.Map; +import java.util.concurrent.CountDownLatch; /** * @author howeye @@ -46,15 +49,47 @@ public interface ProjectUserDao { * @param size * @return */ - List findByUsernameAndPermissionAndProjectType(String username, Integer projectType, int page, int size); + List findByUserNameAndProjectType(String username, 
Integer projectType, int page, int size); /** - * Find project user by userId + * Paging find project user by all permission. + * @param username + * @param projectType + * @return + */ + List findByUserNameAndProjectTypeWithOutPage(String username, Integer projectType); + + /** + * pagin find project user by advance conditions + * @param username + * @param projectType + * @param projectName + * @param subsystemName + * @param createUser + * @param db + * @param table + * @param startTime + * @param endTime + * @param page + * @param size + * @return + */ + Page findByAdvanceConditions(String username, Integer projectType, String projectName, String subsystemName, String createUser, String db, String table, Long startTime, Long endTime, int page, int size); + + /** + * Find project by user and permissions * @param username * @param permissions * @return */ - List findByUsernameAndPermission(String username, List permissions); + List findByUsernameAndPermissions(String username, List permissions); + + /** + * Find project by user + * @param username + * @return + */ + List> findProjectByUserName(String username); /** * Count project user by userId with project type @@ -71,15 +106,14 @@ public interface ProjectUserDao { * @param projectType * @return */ - Long countByUsernameAndPermissionAndProjectType(String username, Integer projectType); + Long countByUserNameAndProjectType(String username, Integer projectType); /** - * Count project user by userId + * Count project user by user name * @param username - * @param permissions * @return */ - Long countByUsernameAndPermission(String username, List permissions); + Long countProjectByUserName(String username); /** * Find by project user by project @@ -88,15 +122,6 @@ public interface ProjectUserDao { */ List findByProject(Project project); - /** - * Find project user by project with page. 
- * @param project - * @param page - * @param size - * @return - */ - List findByProjectPageable(Project project, int page, int size); - /** * Find project user by username and permissions * @param param @@ -122,4 +147,43 @@ public interface ProjectUserDao { * @param userName */ void deleteByProjectAndUserName(Project project, String userName); + + /** + * find User Name And Automatic + * + * @param userName + * @param flag + * @return + */ + List findUserNameAndAutomatic(String userName, Boolean flag); + + /** + * find User Name And Automatic + * + * @param userName + * @return + */ + List findByUserName(String userName); + + /** + * Delete all + * @param projectUserList + * @param countDownLatch + */ + void deleteInBatch(List projectUserList,CountDownLatch countDownLatch); + + /** + * batchInsert + * @param projectUserList + * @param countDownLatch + */ + void batchInsert(List projectUserList,CountDownLatch countDownLatch); + + /** + * Find projectUser by ids + * @param projectUserIds + * @return + */ + List findByIds(List projectUserIds); + } diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java similarity index 88% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java index 6a929a73..26af2261 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectDaoImpl.java @@ -19,7 +19,6 @@ import com.webank.wedatasphere.qualitis.project.dao.ProjectDao; import com.webank.wedatasphere.qualitis.project.dao.repository.ProjectRepository; import com.webank.wedatasphere.qualitis.project.entity.Project; -import 
com.webank.wedatasphere.qualitis.project.dao.ProjectDao; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.data.domain.PageRequest; import org.springframework.data.domain.Pageable; @@ -54,11 +53,16 @@ public Project saveProject(Project project) { @Override public List findAllProject(int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + Sort sort = Sort.by(Sort.Direction.ASC, "id"); Pageable pageable = PageRequest.of(page, size, sort); return projectRepository.findAll(pageable).getContent(); } + @Override + public List findAll() { + return projectRepository.findAll(); + } + @Override public Long countAll() { return projectRepository.count(); @@ -78,4 +82,9 @@ public void deleteProject(Project project) { public List findByCreateUser(String createUser) { return projectRepository.findByCreateUser(createUser); } + + @Override + public List findAllById(List projectIds) { + return projectRepository.findAllById(projectIds); + } } diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java similarity index 75% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java index ca4b49bc..7fadc2fa 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectEventDaoImpl.java @@ -6,6 +6,7 @@ import com.webank.wedatasphere.qualitis.project.entity.ProjectEvent; import java.util.List; import org.springframework.beans.factory.annotation.Autowired; +import org.springframework.data.domain.Page; import org.springframework.data.domain.PageRequest; import 
org.springframework.data.domain.Pageable; import org.springframework.data.domain.Sort; @@ -27,14 +28,16 @@ public ProjectEvent save(ProjectEvent projectEvent) { } @Override - public List find(int page, int size, Project project, Integer typeId) { - Sort sort = new Sort(Direction.DESC, "time"); - Pageable pageable = PageRequest.of(page, size, sort); - return projectEventRepository.findByProject(project, typeId, pageable).getContent(); + public void saveBatch(List projectEvents) { + projectEventRepository.saveAll(projectEvents); } @Override - public long count(Project project, Integer typeId) { - return projectEventRepository.countByProject(project, typeId); + public Page findWithPage(int page, int size, Long projectId) { + Sort sort = Sort.by(Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return projectEventRepository.findByProjectId(projectId, pageable); } + + } diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectLabelDaoImpl.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectLabelDaoImpl.java similarity index 100% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectLabelDaoImpl.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectLabelDaoImpl.java diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java similarity index 50% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java index 110584a4..ecc3ff29 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java +++ 
b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/impl/ProjectUserDaoImpl.java
@@ -20,59 +20,93 @@
 import com.webank.wedatasphere.qualitis.project.dao.repository.ProjectUserRepository;
 import com.webank.wedatasphere.qualitis.project.entity.Project;
 import com.webank.wedatasphere.qualitis.project.entity.ProjectUser;
-import com.webank.wedatasphere.qualitis.query.queryqo.DataSourceQo;
+import com.webank.wedatasphere.qualitis.project.queryqo.DataSourceQo;
 import org.apache.commons.lang.StringUtils;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.data.domain.Page;
 import org.springframework.data.domain.PageRequest;
 import org.springframework.data.domain.Pageable;
 import org.springframework.data.domain.Sort;
+import org.springframework.data.domain.Sort.Direction;
+import org.springframework.scheduling.annotation.Async;
 import org.springframework.stereotype.Repository;
+
+import javax.persistence.EntityManager;
+import javax.persistence.PersistenceContext;
 import javax.persistence.criteria.Predicate;
 import java.util.ArrayList;
 import java.util.List;
+import java.util.Map;
+import java.util.concurrent.CountDownLatch;
 
 /**
  * @author howeye
  */
 @Repository
 public class ProjectUserDaoImpl implements ProjectUserDao {
+    private static final Logger log = LoggerFactory.getLogger(ProjectUserDaoImpl.class);
 
     @Autowired
     private ProjectUserRepository projectUserRepository;
 
+    @PersistenceContext
+    private EntityManager entityManager;
+
+
     @Override
     public List<ProjectUser> findByUsernameAndPermissionAndProjectType(String username, Integer permission, Integer projectType, int page, int size) {
-        Sort sort = new Sort(Sort.Direction.ASC, "project");
+        Sort sort = Sort.by(Sort.Direction.DESC, "project");
         Pageable pageable = PageRequest.of(page, size, sort);
-        return projectUserRepository.findByUserNameAndPermissionAndProjectType(username,
permission, projectType, pageable).getContent(); + return projectUserRepository.findByUsernameAndPermissionAndProjectType(username, permission, projectType, pageable).getContent(); + } + + @Override + public List findByUserNameAndProjectType(String username, Integer projectType, int page, int size) { + Sort sort = Sort.by(Direction.DESC, "id"); + Pageable pageable = PageRequest.of(page, size, sort); + return projectUserRepository.findByUserNameAndProjectType(username, projectType, pageable).getContent(); + } + + @Override + public List findByUserNameAndProjectTypeWithOutPage(String username, Integer projectType) { + return projectUserRepository.findByUserNameAndProjectTypeWithOutPage(username, projectType); } @Override - public List findByUsernameAndPermissionAndProjectType(String username, Integer projectType, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "id"); + public Page findByAdvanceConditions(String username, Integer projectType, String projectName, String subsystemName, String createUser, String db, String table, Long startTime, Long endTime, int page, int size) { + Sort sort = Sort.by(Direction.DESC, "id"); Pageable pageable = PageRequest.of(page, size, sort); - return projectUserRepository.findByUserNameAndPermissionAndProjectType(username, projectType, pageable).getContent(); + if (StringUtils.isNotEmpty(subsystemName)) { + subsystemName = "%" + subsystemName + "%"; + } + return projectUserRepository.findByAdvanceConditions(username, projectType, projectName, subsystemName, createUser, db, table, startTime, endTime, pageable); } @Override - public List findByUsernameAndPermission(String username, List permissions) { + public List findByUsernameAndPermissions(String username, List permissions) { return projectUserRepository.findByUserNameAndPermissions(username, permissions); } + @Override + public List> findProjectByUserName(String username) { + return projectUserRepository.findProjectByUserName(username); + } + @Override public Long 
countByUsernameAndPermissionAndProjectType(String username, Integer permission, Integer projectType) { return projectUserRepository.countByUserNameAndPermissionAndProjectType(username, permission, projectType); } @Override - public Long countByUsernameAndPermissionAndProjectType(String username, Integer projectType) { - return projectUserRepository.countByUserNameAndPermissionAndProjectType(username, projectType); + public Long countByUserNameAndProjectType(String username, Integer projectType) { + return projectUserRepository.countByUserNameAndProjectType(username, projectType); } @Override - public Long countByUsernameAndPermission(String username, List permissions) { - return projectUserRepository.countByUserNameAndPermission(username, permissions); + public Long countProjectByUserName(String username) { + return projectUserRepository.countProjectByUserName(username); } @Override @@ -80,13 +116,6 @@ public List findByProject(Project project) { return projectUserRepository.findByProject(project); } - @Override - public List findByProjectPageable(Project project, int page, int size) { - Sort sort = new Sort(Sort.Direction.ASC, "userName"); - Pageable pageable = PageRequest.of(page, size, sort); - return projectUserRepository.findByProject(project, pageable).getContent(); - } - @Override public List findByUsernameAndPermissionsIn(DataSourceQo param){ return projectUserRepository.findAll((root, query, cb) -> { @@ -116,6 +145,47 @@ public void deleteByProject(Project project) { @Override public void deleteByProjectAndUserName(Project project, String userName) { - projectUserRepository.deleteByProjectAndUserName(project, userName); + projectUserRepository.deleteByProjectAndUserName(project.getId(), userName); } + + @Override + public List findUserNameAndAutomatic(String userName, Boolean flag) { + return projectUserRepository.findUserNameAndAutomatic(userName, flag); + } + + @Override + public List findByUserName(String userName) { + return 
projectUserRepository.findByUserName(userName); + } + + + @Override + @Async("asyncServiceExecutor") + public void deleteInBatch(List projectUserList, CountDownLatch countDownLatch) { + try{ + log.warn("start executeAsync"); + projectUserRepository.deleteAll(projectUserList); + log.warn("end executeAsync"); + }finally { + countDownLatch.countDown(); + } + } + + @Override + @Async("asyncServiceExecutor") + public void batchInsert(List projectUserList,CountDownLatch countDownLatch) { + try{ + log.warn("start executeAsync"); + projectUserRepository.saveAll(projectUserList); + log.warn("end executeAsync"); + }finally { + countDownLatch.countDown(); + } + } + + @Override + public List findByIds(List projectUserIds) { + return projectUserRepository.findAllById(projectUserIds); + } + } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectEventRepository.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectEventRepository.java new file mode 100644 index 00000000..b4ba9c15 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectEventRepository.java @@ -0,0 +1,22 @@ +package com.webank.wedatasphere.qualitis.project.dao.repository; + +import com.webank.wedatasphere.qualitis.project.entity.ProjectEvent; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.Pageable; +import org.springframework.data.jpa.repository.JpaRepository; +import org.springframework.data.jpa.repository.JpaSpecificationExecutor; + +/** + * @author allenzhou + */ +public interface ProjectEventRepository extends JpaRepository, JpaSpecificationExecutor { + + /** + * find By Project + * @param projectId + * @param pageable + * @return + */ + Page findByProjectId(Long projectId, Pageable pageable); + +} diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectLabelRepository.java 
b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectLabelRepository.java similarity index 100% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectLabelRepository.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectLabelRepository.java diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectRepository.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectRepository.java similarity index 100% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectRepository.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectRepository.java diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java similarity index 54% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java index 4261df0d..398ed7cc 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/dao/repository/ProjectUserRepository.java @@ -18,10 +18,12 @@ import com.webank.wedatasphere.qualitis.project.entity.Project; import com.webank.wedatasphere.qualitis.project.entity.ProjectUser; +import java.util.Map; import org.springframework.data.domain.Page; import org.springframework.data.domain.Pageable; import org.springframework.data.jpa.repository.JpaRepository; import 
org.springframework.data.jpa.repository.JpaSpecificationExecutor; +import org.springframework.data.jpa.repository.Modifying; import org.springframework.data.jpa.repository.Query; import java.util.List; @@ -40,7 +42,7 @@ public interface ProjectUserRepository extends JpaRepository, * @return */ @Query("select pu from ProjectUser pu inner join pu.project p where pu.userName = ?1 and pu.permission = ?2 and p.projectType = ?3") - Page findByUserNameAndPermissionAndProjectType(String userName, Integer permission, Integer projectType, Pageable pageable); + Page findByUsernameAndPermissionAndProjectType(String userName, Integer permission, Integer projectType, Pageable pageable); /** * Paging find project user by userId @@ -50,10 +52,43 @@ public interface ProjectUserRepository extends JpaRepository, * @return */ @Query("select pu from ProjectUser pu inner join Project p on pu.project = p where pu.userName = ?1 and p.projectType = ?2 group by pu.project") - Page findByUserNameAndPermissionAndProjectType(String userName, Integer projectType, Pageable pageable); + Page findByUserNameAndProjectType(String userName, Integer projectType, Pageable pageable); /** - * Count by username and permission and project type + * Paging find project user by userId + * @param userName + * @param projectType + * @return + */ + @Query("select pu from ProjectUser pu inner join Project p on pu.project = p where pu.userName = ?1 and p.projectType = ?2 group by pu.project") + List findByUserNameAndProjectTypeWithOutPage(String userName, Integer projectType); + + /** + * find By Advance Conditions + * @param username + * @param projectType + * @param projectName + * @param subSystemName + * @param createUser + * @param db + * @param table + * @param startTime + * @param endTime + * @param pageable + * @return + */ + @Query(value = "select pu from Project p inner join ProjectUser pu on pu.project = p left join RuleDataSource ds on ds.projectId = p.id " + + "where pu.userName = ?1 and p.projectType 
= ?2 and p.name like ?3 and (?4 is null or p.subSystemName like ?4) " + + "and (?5 is null or p.createUser = ?5) and (?6 is null or ds.dbName = ?6) and (?7 is null or ds.tableName = ?7) " + + "and (?8 is null or UNIX_TIMESTAMP(p.createTime) >= ?8) and (?9 is null or UNIX_TIMESTAMP(p.createTime) < ?9) group by pu.project" + , countQuery = "select count(DISTINCT pu.project) from ProjectUser pu inner join Project p on pu.project = p left join RuleDataSource ds on ds.projectId = p.id " + + "where pu.userName = ?1 and p.projectType = ?2 and p.name like ?3 and (?4 is null or p.subSystemName like ?4) " + + "and (?5 is null or p.createUser = ?5) and (?6 is null or ds.dbName = ?6) and (?7 is null or ds.tableName = ?7) " + + "and (?8 is null or UNIX_TIMESTAMP(p.createTime) >= ?8) and (?9 is null or UNIX_TIMESTAMP(p.createTime) < ?9)") + Page findByAdvanceConditions(String username, Integer projectType, String projectName, String subSystemName, String createUser, String db, String table, Long startTime, Long endTime, Pageable pageable); + + /** + * Count by user name and permission and project type * @param userName * @param permission * @param projectType @@ -63,13 +98,22 @@ public interface ProjectUserRepository extends JpaRepository, Long countByUserNameAndPermissionAndProjectType(String userName, Integer permission, Integer projectType); /** - * Count by username and permissions and project type + * Count by user name and project type * @param userName * @param projectType * @return */ @Query("select count(DISTINCT pu.project) from ProjectUser pu inner join Project p on pu.project = p where pu.userName = ?1 and p.projectType = ?2") - Long countByUserNameAndPermissionAndProjectType(String userName, Integer projectType); + Long countByUserNameAndProjectType(String userName, Integer projectType); + + /** + * Find distinct projects by user name and permission + * @param userName + * @param permission + * @return + */ + @Query("select DISTINCT pu.project from ProjectUser pu inner join pu.project p where
pu.userName = ?1 and pu.permission = ?2") + List findByUserNameAndPermission(String userName, Integer permission); /** * Count by username and permissions @@ -81,20 +125,20 @@ public interface ProjectUserRepository extends JpaRepository, Long countByUserNameAndPermission(String userName, List permissions); /** - * Find project user by username + * Find project user by user name * @param userName * @return */ - List findByUserName(String userName); + @Query("select DISTINCT new map(p.name as project_name, p.id as project_id) from ProjectUser pu inner join pu.project p where pu.userName = ?1") + List> findProjectByUserName(String userName); /** - * Find project user by userId + * Count by username * @param userName - * @param permission * @return */ - @Query("select DISTINCT pu.project from ProjectUser pu inner join pu.project p where pu.userName = ?1 and pu.permission = ?2") - List findByUserNameAndPermission(String userName, Integer permission); + @Query("select count(DISTINCT pu.project) from ProjectUser pu inner join pu.project p where pu.userName = ?1") + Long countProjectByUserName(String userName); /** * Find project user by userId @@ -113,10 +157,13 @@ public interface ProjectUserRepository extends JpaRepository, /** * Delete project user by project and user name - * @param project + * + * @param projectId * @param userName */ - void deleteByProjectAndUserName(Project project, String userName); + @Modifying + @Query(value = "delete from qualitis_project_user where project_id = ?1 and user_name = ?2", nativeQuery = true) + void deleteByProjectAndUserName(Long projectId, String userName); /** * Find project user by project @@ -125,6 +172,13 @@ public interface ProjectUserRepository extends JpaRepository, */ List findByProject(Project project); + /** + * Find project user by username + * @param userName + * @return + */ + List findByUserName(String userName); + /** * Find project user by project with page. 
* @param project @@ -132,4 +186,14 @@ public interface ProjectUserRepository extends JpaRepository, * @return */ Page findByProject(Project project, Pageable pageable); + + /** + * find User Name And Automatic + * + * @param userName + * @param flag + * @return + */ + @Query("select pu from ProjectUser pu where pu.userName = ?1 and pu.automaticSwitch = ?2") + List findUserNameAndAutomatic(String userName, Boolean flag); } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/Project.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/Project.java index 48d9d78e..4e4bdad4 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/Project.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/Project.java @@ -21,6 +21,8 @@ import javax.persistence.*; import java.util.Set; +import org.apache.commons.lang.StringUtils; +import org.codehaus.jackson.annotate.JsonIgnore; /** * @author howeye @@ -33,20 +35,19 @@ public class Project { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Long id; - @Column(length = 170) + @Column(length = 128) private String name; - @Column(name = "cn_name") + @Column(name = "cn_name", length = 128) private String cnName; @Column(length = 1700) private String description; - @OneToMany(mappedBy = "project", cascade = {CascadeType.REMOVE, CascadeType.PERSIST}) + @OneToMany(mappedBy = "project", cascade = {CascadeType.REMOVE, CascadeType.PERSIST}, fetch = FetchType.EAGER) + @JsonIgnore private Set projectUsers; - @OneToMany(mappedBy = "project", cascade = {CascadeType.REMOVE, CascadeType.PERSIST}) - private Set projectEvents; - @OneToMany(mappedBy = "project", cascade = {CascadeType.REMOVE, CascadeType.PERSIST}, fetch = FetchType.EAGER) + @JsonIgnore private Set projectLabels; @Column(name = "create_user", length = 50) @@ -54,17 +55,28 @@ public class Project { /** * Full name, such as: Tom(chinese_name) */ - 
@Column(name = "create_user_full_name", length = 50) + @Column(name = "create_user_full_name", length = 100) private String createUserFullName; - @Column(name = "user_department", length = 50) - private String userDepartment; + @Column(name = "department", length = 50) + private String department; + + @Column(name = "sub_system_id") + private Long subSystemId; + + @Column(name = "sub_system_name") + private String subSystemName; @Column(name = "create_time", length = 25) private String createTime; @Column(name = "modify_user", length = 50) private String modifyUser; + /** + * Full name, such as: Tom(chinese_name) + */ + @Column(name = "modify_user_full_name", length = 100) + private String modifyUserFullName; @Column(name = "modify_time", length = 25) private String modifyTime; @@ -80,8 +92,8 @@ public Project(String projectName, String cnName, String description, String use this.cnName = cnName; this.description = description; this.createUser = username; - this.createUserFullName = username + "(" + chineseName + ")"; - this.userDepartment = department; + this.createUserFullName = username + "(" + (StringUtils.isNotEmpty(chineseName) ? 
chineseName : "") + ")"; + this.department = department; this.createTime = createTime; } @@ -149,12 +161,12 @@ public void setCreateUserFullName(String createUserFullName) { this.createUserFullName = createUserFullName; } - public String getUserDepartment() { - return userDepartment; + public String getDepartment() { + return department; } - public void setUserDepartment(String userDepartment) { - this.userDepartment = userDepartment; + public void setDepartment(String department) { + this.department = department; } public Integer getProjectType() { @@ -181,6 +193,14 @@ public void setModifyUser(String modifyUser) { this.modifyUser = modifyUser; } + public String getModifyUserFullName() { + return modifyUserFullName; + } + + public void setModifyUserFullName(String modifyUserFullName) { + this.modifyUserFullName = modifyUserFullName; + } + public String getModifyTime() { return modifyTime; } @@ -189,12 +209,20 @@ public void setModifyTime(String modifyTime) { this.modifyTime = modifyTime; } - public Set getProjectEvents() { - return projectEvents; + public Long getSubSystemId() { + return subSystemId; + } + + public void setSubSystemId(Long subSystemId) { + this.subSystemId = subSystemId; + } + + public String getSubSystemName() { + return subSystemName; } - public void setProjectEvents(Set projectEvents) { - this.projectEvents = projectEvents; + public void setSubSystemName(String subSystemName) { + this.subSystemName = subSystemName; } @Override @@ -202,10 +230,17 @@ public String toString() { return "Project{" + "id=" + id + ", name='" + name + '\'' + + ", cnName='" + cnName + '\'' + ", description='" + description + '\'' + ", createUser='" + createUser + '\'' + ", createUserFullName='" + createUserFullName + '\'' + - ", userDepartment='" + userDepartment + '\'' + + ", userDepartment='" + department + '\'' + + ", subSystemId=" + subSystemId + + ", subSystemName='" + subSystemName + '\'' + + ", createTime='" + createTime + '\'' + + ", modifyUser='" + modifyUser + 
'\'' + + ", modifyTime='" + modifyTime + '\'' + + ", projectType=" + projectType + '}'; } } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectEvent.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectEvent.java index ecdf8d27..01bbe315 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectEvent.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectEvent.java @@ -19,58 +19,28 @@ public class ProjectEvent { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Long id; - - @ManyToOne - private Project project; - - @Column(name = "content", length = 500) + @Column(name = "project_id") + private Long projectId; + @Column(name = "content", columnDefinition = "MEDIUMTEXT") private String content; - @Column(name = "field", length = 50) - private String field; - @Column(name = "before_modify", length = 200) - private String beforeModify; - @Column(name = "after_modify", length = 200) - private String afterModify; - @Column(name = "modify_user", length = 50) - private String modifyUser; - @Column(name = "execute_user", length = 50) - private String executeUser; - @Column(name = "time", length = 25) private String time; - - @Column(name = "event_type") - private Integer eventType; + @Column(name = "operate_type") + private Integer operateType; + @Column(name = "operate_user") + private String operateUser; public ProjectEvent() { } - public ProjectEvent(Project project, String content, String time) { - this.project = project; + public ProjectEvent(Project project, String operateUser, String content, String time, Integer operateType) { + this.operateUser = operateUser; + this.operateType = operateType; + this.projectId = project.getId(); this.content = content; this.time = time; } - public ProjectEvent(Project project, String executeUser, String content, String time, Integer eventType) { - this.executeUser = 
executeUser; - this.eventType = eventType; - this.project = project; - this.content = content; - this.time = time; - } - - public ProjectEvent(Project projectInDb, String userName, String field, String beforeModify, String afterModify, String time, Integer eventType) { - this.beforeModify = beforeModify; - this.afterModify = afterModify; - this.eventType = eventType; - this.project = projectInDb; - this.modifyUser = userName; - this.field = field; - this.time = time; - - this.content = userName + " modified " + field; - } - public Long getId() { return id; } @@ -79,12 +49,12 @@ public void setId(Long id) { this.id = id; } - public Project getProject() { - return project; + public Long getProjectId() { + return projectId; } - public void setProject(Project project) { - this.project = project; + public void setProjectId(Long projectId) { + this.projectId = projectId; } public String getContent() { @@ -95,46 +65,6 @@ public void setContent(String content) { this.content = content; } - public String getField() { - return field; - } - - public void setField(String field) { - this.field = field; - } - - public String getBeforeModify() { - return beforeModify; - } - - public void setBeforeModify(String beforeModify) { - this.beforeModify = beforeModify; - } - - public String getAfterModify() { - return afterModify; - } - - public void setAfterModify(String afterModify) { - this.afterModify = afterModify; - } - - public String getModifyUser() { - return modifyUser; - } - - public void setModifyUser(String modifyUser) { - this.modifyUser = modifyUser; - } - - public String getExecuteUser() { - return executeUser; - } - - public void setExecuteUser(String executeUser) { - this.executeUser = executeUser; - } - public String getTime() { return time; } @@ -143,12 +73,20 @@ public void setTime(String time) { this.time = time; } - public Integer getEventType() { - return eventType; + public Integer getOperateType() { + return operateType; + } + + public void 
setOperateType(Integer operateType) { + this.operateType = operateType; + } + + public String getOperateUser() { + return operateUser; } - public void setEventType(Integer eventType) { - this.eventType = eventType; + public void setOperateUser(String operateUser) { + this.operateUser = operateUser; } @Override @@ -161,14 +99,14 @@ public boolean equals(Object o) { } ProjectEvent that = (ProjectEvent) o; return Objects.equals(id, that.id) && - Objects.equals(project, that.project) && + Objects.equals(projectId, that.projectId) && Objects.equals(content, that.content) && Objects.equals(time, that.time) && - Objects.equals(eventType, that.eventType); + Objects.equals(operateType, that.operateType); } @Override public int hashCode() { - return Objects.hash(id, project, content, time, eventType); + return Objects.hash(id, projectId, content, time, operateType); } } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectLabel.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectLabel.java index 2da30b00..988dd301 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectLabel.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectLabel.java @@ -21,12 +21,12 @@ import java.util.Objects; import javax.persistence.Column; import javax.persistence.Entity; -import javax.persistence.FetchType; import javax.persistence.GeneratedValue; import javax.persistence.GenerationType; -import javax.persistence.Id; import javax.persistence.ManyToOne; import javax.persistence.Table; +import javax.persistence.Id; +import org.codehaus.jackson.annotate.JsonIgnore; /** * @author allenzhou @@ -41,6 +41,7 @@ public class ProjectLabel { private Long id; @ManyToOne + @JsonIgnore private Project project; @Column(name = "label_name", length = 20) diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectUser.java 
b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectUser.java index 22f307a6..690d067d 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectUser.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/entity/ProjectUser.java @@ -18,8 +18,10 @@ import com.fasterxml.jackson.annotation.JsonIdentityInfo; import com.fasterxml.jackson.annotation.ObjectIdGenerators; + import javax.persistence.*; import java.util.Objects; +import org.codehaus.jackson.annotate.JsonIgnore; /** * @author howeye @@ -34,6 +36,7 @@ public class ProjectUser { private Long id; @ManyToOne + @JsonIgnore private Project project; private Integer permission; @@ -44,20 +47,25 @@ public class ProjectUser { @Column(name = "user_full_name", length = 30) private String userFullName; + @Column(name = "automatic_switch") + private Boolean automaticSwitch; + public ProjectUser() { } - public ProjectUser(Integer permission, Project project, String userName) { + public ProjectUser(Integer permission, Project project, String userName, Boolean flag) { this.permission = permission; this.project = project; this.userName = userName; + this.automaticSwitch = flag; } - public ProjectUser(Integer permission, Project project, String userName, String userFullName) { + public ProjectUser(Integer permission, Project project, String userName, String userFullName, Boolean flag) { this.permission = permission; this.project = project; this.userName = userName; this.userFullName = userFullName; + this.automaticSwitch = flag; } public Long getId() { @@ -84,28 +92,40 @@ public void setPermission(Integer permission) { this.permission = permission; } - public String getUserName() { - return userName; - } + public String getUserName() { + return userName; + } - public void setUserName(String userName) { - this.userName = userName; - } + public void setUserName(String userName) { + this.userName = userName; + } - public String 
getUserFullName() { - return userFullName; - } + public String getUserFullName() { + return userFullName; + } - public void setUserFullName(String userFullName) { - this.userFullName = userFullName; - } + public void setUserFullName(String userFullName) { + this.userFullName = userFullName; + } + + public Boolean getAutomaticSwitch() { + return automaticSwitch; + } - @Override + public void setAutomaticSwitch(Boolean automaticSwitch) { + this.automaticSwitch = automaticSwitch; + } + + @Override public boolean equals(Object o) { - if (this == o) {return true;} - if (o == null || getClass() != o.getClass()) {return false;} + if (this == o) { + return true; + } + if (o == null || getClass() != o.getClass()) { + return false; + } ProjectUser that = (ProjectUser) o; - return Objects.equals(id, that.id); + return Objects.equals(id, that.id) && Objects.equals(userName, that.userName); } @Override @@ -116,7 +136,6 @@ public int hashCode() { @Override public String toString() { return "ProjectUser{" + - "id=" + id + ", project=" + project.getId() + ", permission=" + permission + ", userName='" + userName + '\'' + diff --git a/web/project/src/main/java/com/webank/wedatasphere/qualitis/query/queryqo/DataSourceQo.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/queryqo/DataSourceQo.java similarity index 92% rename from web/project/src/main/java/com/webank/wedatasphere/qualitis/query/queryqo/DataSourceQo.java rename to core/project/src/main/java/com/webank/wedatasphere/qualitis/project/queryqo/DataSourceQo.java index 324ff327..b039cd11 100644 --- a/web/project/src/main/java/com/webank/wedatasphere/qualitis/query/queryqo/DataSourceQo.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/queryqo/DataSourceQo.java @@ -14,9 +14,9 @@ * limitations under the License. 
*/ -package com.webank.wedatasphere.qualitis.query.queryqo; +package com.webank.wedatasphere.qualitis.project.queryqo; -import com.webank.wedatasphere.qualitis.query.request.RuleQueryRequest; +import com.webank.wedatasphere.qualitis.project.request.RuleQueryRequest; import org.springframework.beans.BeanUtils; diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/request/RuleQueryRequest.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/request/RuleQueryRequest.java new file mode 100644 index 00000000..5ffb9d5c --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/project/request/RuleQueryRequest.java @@ -0,0 +1,262 @@ +/* + * Copyright 2019 WeBank + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ */
+
+package com.webank.wedatasphere.qualitis.project.request;
+
+import com.fasterxml.jackson.annotation.JsonProperty;
+import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException;
+import org.apache.commons.lang3.StringUtils;
+
+/**
+ *
+ * @author v_wblwyan
+ * @date 2018-11-1
+ */
+public class RuleQueryRequest {
+
+    private String cluster;
+    @JsonProperty("datasource_id")
+    private Long datasourceId;
+    private String db;
+    private String table;
+    private String column;
+    private String user;
+    private Integer datasourceType;
+    @JsonProperty("rule_template_id")
+    private Long ruleTemplateId;
+    /**
+     * 1 == table, 2 == column
+     */
+    @JsonProperty("relation_object_type")
+    private Integer relationObjectType;
+    @JsonProperty("sub_system_id")
+    private Long subSystemId;
+    @JsonProperty("department_name")
+    private String departmentName;
+    @JsonProperty("dev_department_name")
+    private String devDepartmentName;
+    @JsonProperty("tag_code")
+    private String tagCode;
+    private String envName;
+
+    @JsonProperty("column_name")
+    private String fieldName;
+    @JsonProperty("data_type")
+    private String dataType;
+    @JsonProperty("is_primary")
+    private Boolean isPrimary;
+    @JsonProperty("is_partition")
+    private Boolean isPartitionField;
+
+    private int page;
+    private int size;
+
+    public RuleQueryRequest() {
+        this.page = 0;
+        this.size = 10;
+    }
+
+    public Integer getRelationObjectType() {
+        return relationObjectType;
+    }
+
+    public void setRelationObjectType(Integer relationObjectType) {
+        this.relationObjectType = relationObjectType;
+    }
+
+    public RuleQueryRequest(String user) {
+        this.user = user;
+    }
+
+    public String getCluster() {
+        return cluster;
+    }
+
+    public void setCluster(String cluster) {
+        this.cluster = cluster;
+    }
+
+    public Long getDatasourceId() {
+        return datasourceId;
+    }
+
+    public void setDatasourceId(Long datasourceId) {
+        this.datasourceId = datasourceId;
+    }
+
+    public String getDb() {
+        return db;
+    }
+
+    public void setDb(String db) {
+        this.db = db;
+    }
+
+    public String getTable() {
+        return table;
+    }
+
+    public void setTable(String table) {
+        this.table = table;
+    }
+
+    public String getColumn() {
+        return column;
+    }
+
+    public void setColumn(String column) {
+        this.column = column;
+    }
+
+    public String getUser() {
+        return user;
+    }
+
+    public void setUser(String user) {
+        this.user = user;
+    }
+
+    public Integer getDatasourceType() {
+        return datasourceType;
+    }
+
+    public void setDatasourceType(Integer datasourceType) {
+        this.datasourceType = datasourceType;
+    }
+
+    public int getPage() {
+        return page;
+    }
+
+    public void setPage(int page) {
+        this.page = page;
+    }
+
+    public int getSize() {
+        return size;
+    }
+
+    public void setSize(int size) {
+        this.size = size;
+    }
+
+    public Long getRuleTemplateId() {
+        return ruleTemplateId;
+    }
+
+    public void setRuleTemplateId(Long ruleTemplateId) {
+        this.ruleTemplateId = ruleTemplateId;
+    }
+
+    public String getFieldName() {
+        return fieldName;
+    }
+
+    public void setFieldName(String fieldName) {
+        this.fieldName = fieldName;
+    }
+
+    public String getDataType() {
+        return dataType;
+    }
+
+    public void setDataType(String dataType) {
+        this.dataType = dataType;
+    }
+
+    public Boolean getPrimary() {
+        return isPrimary;
+    }
+
+    public void setPrimary(Boolean primary) {
+        isPrimary = primary;
+    }
+
+    public Boolean getPartitionField() {
+        return isPartitionField;
+    }
+
+    public void setPartitionField(Boolean partitionField) {
+        isPartitionField = partitionField;
+    }
+
+    public Long getSubSystemId() {
+        return subSystemId;
+    }
+
+    public void setSubSystemId(Long subSystemId) {
+        this.subSystemId = subSystemId;
+    }
+
+    public String getDepartmentName() {
+        return departmentName;
+    }
+
+    public void setDepartmentName(String departmentName) {
+        this.departmentName = departmentName;
+    }
+
+    public String getDevDepartmentName() {
+        return devDepartmentName;
+    }
+
+    public void setDevDepartmentName(String devDepartmentName) {
+        this.devDepartmentName = devDepartmentName;
+    }
+
+    public String getTagCode() {
+        return tagCode;
+    }
+
+    public void setTagCode(String tagCode) {
+        this.tagCode = tagCode;
+    }
+
+    public String getEnvName() {
+        return envName;
+    }
+
+    public void setEnvName(String envName) {
+        this.envName = envName;
+    }
+
+    public void checkRequest() throws UnExpectedRequestException {
+        if (cluster == null || "".equals(cluster)
+            || table == null || "".equals(table)) {
+            throw new UnExpectedRequestException("Params of {&REQUEST_CAN_NOT_BE_NULL}");
+        }
+    }
+
+    public void convertParameter() {
+        this.cluster = StringUtils.trimToNull(this.cluster);
+        this.column = StringUtils.trimToNull(this.column);
+        this.db = StringUtils.trimToNull(this.db);
+        this.table = StringUtils.trimToNull(this.table);
+        this.tagCode = StringUtils.trimToNull(this.tagCode);
+        this.envName = StringUtils.trimToNull(this.envName);
+        if (StringUtils.isEmpty(this.departmentName)) {
+            this.departmentName = null;
+        } else {
+            this.departmentName = "%" + this.departmentName + "%";
+        }
+        if (StringUtils.isEmpty(this.devDepartmentName)) {
+            this.devDepartmentName = null;
+        } else {
+            this.devDepartmentName = "%" + this.devDepartmentName + "%";
+        }
+    }
+
+}
\ No newline at end of file
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/CheckTemplateEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/CheckTemplateEnum.java
index 6dc92594..f7290872 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/CheckTemplateEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/CheckTemplateEnum.java
@@ -16,9 +16,6 @@
 
 package com.webank.wedatasphere.qualitis.rule.constant;
 
-import java.util.Arrays;
-import java.util.List;
-
 /**
  * Enum in checkTemplate of AlarmConfig
  * @author howeye
@@ -28,30 +25,27 @@ public enum CheckTemplateEnum {
      * Monthly, weekly, day and fixed name
      * Ring growth, year on year
      */
-    MONTH_FLUCTUATION(1,"月波动", "Month Fluctuation", Arrays.asList(Number.class)),
-    WEEK_FLUCTUATION(2,"周波动", "Week Fluctuation", Arrays.asList(Number.class)),
-    DAY_FLUCTUATION(3,"日波动", "Daily Fluctuation", Arrays.asList(Number.class)),
-    FIXED_VALUE(4,"固定值", "Fix Value", Arrays.asList(Number.class)),
-    FULL_YEAR_RING_GROWTH(5,"年环比", "Full Year Ring Growth", Arrays.asList(Number.class)),
-    HALF_YEAR_GROWTH(6,"半年环比", "Half Year Ring Growth", Arrays.asList(Number.class)),
-    SEASON_RING_GROWTH(7,"季环比", "Season Ring Growth", Arrays.asList(Number.class)),
-    MONTH_RING_GROWTH(8,"月环比", "Month Ring Growth", Arrays.asList(Number.class)),
-    WEEK_RING_GROWTH(9,"周环比", "Week Ring Growth", Arrays.asList(Number.class)),
-    DAY_RING_GROWTH(10,"日环比", "Day Ring Growth", Arrays.asList(Number.class)),
-    HOUR_RING_GROWTH(11,"时环比", "Hour Ring Growth", Arrays.asList(Number.class)),
-    YEAR_ON_YEAR(12,"月同比", "YEAR ON YEAR", Arrays.asList(Number.class)),
-    ;
+    MONTH_FLUCTUATION(1,"月波动", "Month Fluctuation"),
+    WEEK_FLUCTUATION(2,"周波动", "Week Fluctuation"),
+    DAY_FLUCTUATION(3,"日波动", "Daily Fluctuation"),
+    FIXED_VALUE(4,"固定值", "Fix Value"),
+    FULL_YEAR_RING_GROWTH(5,"年环比", "Full Year Ring Growth"),
+    HALF_YEAR_GROWTH(6,"半年环比", "Half Year Ring Growth"),
+    SEASON_RING_GROWTH(7,"季环比", "Season Ring Growth"),
+    MONTH_RING_GROWTH(8,"月环比", "Month Ring Growth"),
+    WEEK_RING_GROWTH(9,"周环比", "Week Ring Growth"),
+    DAY_RING_GROWTH(10,"日环比", "Day Ring Growth"),
+    HOUR_RING_GROWTH(11,"时环比", "Hour Ring Growth"),
+    YEAR_ON_YEAR(12,"月同比", "YEAR ON YEAR");
 
     private Integer code;
     private String zhMessage;
     private String enMessage;
-    private List<Class> classes;
 
-    CheckTemplateEnum(Integer code, String zhMessage, String enMessage, List<Class> classes) {
+    CheckTemplateEnum(Integer code, String zhMessage, String enMessage) {
         this.code = code;
         this.zhMessage = zhMessage;
         this.enMessage = enMessage;
-        this.classes = classes;
     }
 
     public Integer getCode() {
@@ -66,10 +60,6 @@ public String getEnMessage() {
         return enMessage;
     }
 
-    public List<Class> getClasses() {
-        return classes;
-    }
-
     public static String getCheckTemplateName(Integer code) {
         for (CheckTemplateEnum c : CheckTemplateEnum.values()) {
             if (c.getCode().equals(code)) {
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/ContrastTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/ContrastTypeEnum.java
new file mode 100644
index 00000000..834fa20b
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/ContrastTypeEnum.java
@@ -0,0 +1,84 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import com.google.common.collect.Maps;
+import org.apache.commons.lang.StringUtils;
+
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum ContrastTypeEnum {
+
+    /**
+     * 单向 (分析右表与左表不一致的数据,left join),双向(分析两边不一致的数据,full outer join)
+     * Contrast Type
+     */
+    LEFT_JOIN(1, "分析右表与左表不一致的数据", "LEFT JOIN"),
+    FULL_OUTER_JOIN(2, "分析两边不一致的数据", "FULL OUTER JOIN");
+
+    private Integer code;
+    private String message;
+    private String joinType;
+
+    ContrastTypeEnum(Integer code, String message, String joinType) {
+        this.code = code;
+        this.message = message;
+        this.joinType = joinType;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public String getJoinType() {
+        return joinType;
+    }
+
+    public static String getJoinType(Integer code) {
+        if (code == null) {
+            return ContrastTypeEnum.LEFT_JOIN.getJoinType();
+        }
+        for (ContrastTypeEnum contrastTypeEnum : ContrastTypeEnum.values()) {
+            if (contrastTypeEnum.getCode().equals(code)) {
+                return contrastTypeEnum.getJoinType();
+            }
+        }
+        return ContrastTypeEnum.LEFT_JOIN.getJoinType();
+    }
+
+    public static Integer getCode(String joinType) {
+        if (StringUtils.isEmpty(joinType)) {
+            return ContrastTypeEnum.LEFT_JOIN.getCode();
+        }
+
+        for (ContrastTypeEnum contrastTypeEnum : ContrastTypeEnum.values()) {
+            if (contrastTypeEnum.getJoinType().equals(joinType.toUpperCase())) {
+                return contrastTypeEnum.getCode();
+            }
+        }
+
+        return ContrastTypeEnum.LEFT_JOIN.getCode();
+    }
+
+    public static List<Map<String, Object>> getConstrastEnumList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (ContrastTypeEnum contrastTypeEnum : ContrastTypeEnum.values()) {
+            Map<String, Object> item = Maps.newHashMap();
+            item.put("code", contrastTypeEnum.code);
+            item.put("message", contrastTypeEnum.message);
+            item.put("join_type", contrastTypeEnum.joinType);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputNameEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputNameEnum.java
index 0b9a91e2..12d9a2af 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputNameEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputNameEnum.java
@@ -16,7 +16,6 @@
 
 package com.webank.wedatasphere.qualitis.rule.constant;
 
-import java.util.Arrays;
 import java.util.List;
 
 /**
@@ -27,20 +26,17 @@ public enum FileOutputNameEnum {
     /**
      * file count, dir size.
      */
-    FILE_COUNT(1,"文件数", "file count", Arrays.asList(Number.class)),
-    DIR_SIZE(2,"文件目录大小", "dir size", Arrays.asList(Number.class))
-    ;
+    FILE_COUNT(1,"文件数", "file count"),
+    DIR_SIZE(2,"文件目录大小", "dir size");
 
     private Integer code;
     private String zhMessage;
     private String enMessage;
-    private List<Class> classes;
 
-    FileOutputNameEnum(Integer code, String zhMessage, String enMessage, List<Class> classes) {
+    FileOutputNameEnum(Integer code, String zhMessage, String enMessage) {
         this.code = code;
         this.zhMessage = zhMessage;
         this.enMessage = enMessage;
-        this.classes = classes;
     }
 
     public Integer getCode() {
@@ -55,10 +51,6 @@ public String getEnMessage() {
         return enMessage;
     }
 
-    public List<Class> getClasses() {
-        return classes;
-    }
-
     public static String getFileOutputName(Integer code, String local) {
         for (FileOutputNameEnum c : FileOutputNameEnum.values()) {
             if (c.getCode().equals(code)) {
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputUnitEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputUnitEnum.java
index 75e36f53..6113dc1f 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputUnitEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/FileOutputUnitEnum.java
@@ -27,21 +27,18 @@ public enum FileOutputUnitEnum {
     /**
      * TB, GB, MB, KB
      */
-    TB(1,"TB", Arrays.asList(Number.class)),
-    GB(2,"GB", Arrays.asList(Number.class)),
-    MB(3,"MB", Arrays.asList(Number.class)),
-    KB(4,"KB", Arrays.asList(Number.class)),
-    B(5,"B", Arrays.asList(Number.class))
-    ;
+    TB(1,"TB"),
+    GB(2,"GB"),
+    MB(3,"MB"),
+    KB(4,"KB"),
+    B(5,"B");
 
     private Integer code;
     private String message;
-    private List<Class> classes;
 
-    FileOutputUnitEnum(Integer code, String message, List<Class> classes) {
+    FileOutputUnitEnum(Integer code, String message) {
         this.code = code;
         this.message = message;
-        this.classes = classes;
     }
 
     public Integer getCode() {
@@ -52,10 +49,6 @@ public String getMessage() {
         return message;
     }
 
-    public List<Class> getClasses() {
-        return classes;
-    }
-
     public static String fileOutputUnit(Integer code) {
         for (FileOutputUnitEnum c : FileOutputUnitEnum.values()) {
             if (c.getCode().equals(code)) {
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/MappingTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/MappingTypeEnum.java
new file mode 100644
index 00000000..7182a508
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/MappingTypeEnum.java
@@ -0,0 +1,31 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum MappingTypeEnum {
+
+    /**
+     * Mapping type, including Mapping using CONNECT and MATCHING FIELDS
+     */
+    CONNECT_FIELDS(1, "连接字段"),
+    MATCHING_FIELDS(2, "对比字段")
+    ;
+
+    private Integer code;
+    private String message;
+
+    MappingTypeEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/NoiseStrategyEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/NoiseStrategyEnum.java
new file mode 100644
index 00000000..e87317e4
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/NoiseStrategyEnum.java
@@ -0,0 +1,57 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum NoiseStrategyEnum {
+
+    /**
+     * 只告警不阻断、不告警不阻断
+     */
+    ALARM_ONLY_NO_BLOCKING(1, "只告警不阻断"),
+    NO_ALARM_NO_BLOCKING(2, "不告警不阻断");
+
+    private Integer code;
+    private String message;
+
+    NoiseStrategyEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static List<Map<String, Object>> getNoiseStrategyEnumList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (NoiseStrategyEnum noiseStrategyEnum : NoiseStrategyEnum.values()) {
+            Map<String, Object> item = new HashMap<>();
+            item.put("code", noiseStrategyEnum.code);
+            item.put("message", noiseStrategyEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+    public static String getNoiseStrategyMessage(Integer code) {
+        for (NoiseStrategyEnum noiseStrategyEnum : NoiseStrategyEnum.values()) {
+            if (noiseStrategyEnum.getCode().equals(code)) {
+                return noiseStrategyEnum.getMessage();
+            }
+        }
+        return null;
+    }
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/RoleSystemTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/RoleSystemTypeEnum.java
new file mode 100644
index 00000000..17d8acb8
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/RoleSystemTypeEnum.java
@@ -0,0 +1,54 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+/**
+ * @author allenzhou
+ */
+public enum RoleSystemTypeEnum {
+    /**
+     * type
+     */
+    ADMIN(1, "ADMIN"),
+    PROJECTOR(2, "PROJECTOR"),
+    DEPARTMENT_ADMIN(0, "DEPARTMENT_ADMIN");
+
+    private Integer code;
+    private String message;
+
+    RoleSystemTypeEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static RoleSystemTypeEnum fromCode(Integer roleType) {
+        for (RoleSystemTypeEnum roleSystemTypeEnum: RoleSystemTypeEnum.values()) {
+            if (roleSystemTypeEnum.code.equals(roleType)) {
+                return roleSystemTypeEnum;
+            }
+        }
+        return null;
+    }
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardApproveEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardApproveEnum.java
new file mode 100644
index 00000000..8e1c761a
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardApproveEnum.java
@@ -0,0 +1,48 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum StandardApproveEnum {
+    /**
+     * 数据地图, ITSM, 其他
+     */
+    DATA_MAP(1, "数据地图"),
+    ITSM(2, "ITSM"),
+    OTHER(3, "其他"),
+    ;
+
+    private Integer code;
+    private String message;
+
+    StandardApproveEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static List<Map<String, Object>> getStandardEnumList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (StandardApproveEnum standardApproveEnum : StandardApproveEnum.values()) {
+            Map<String, Object> item = new HashMap<>();
+            item.put("code", standardApproveEnum.code);
+            item.put("message", standardApproveEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardVauleEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardVauleEnum.java
new file mode 100644
index 00000000..bc26f38e
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/StandardVauleEnum.java
@@ -0,0 +1,31 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum StandardVauleEnum {
+    /**
+     * level
+     */
+    DEFAULT_TEMPLATE(1, "内置模版"),
+    DEPARTMENT_TEMPLATE(2, "部门模版"),
+    PERSONAL_TEMPLATE(3, "个人模版");
+
+    private Integer code;
+    private String message;
+
+    StandardVauleEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TableDataTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TableDataTypeEnum.java
new file mode 100644
index 00000000..04a2ea32
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TableDataTypeEnum.java
@@ -0,0 +1,35 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+/**
+ * @author v_minminghe@webank.com
+ * @date 2022-09-14 10:21
+ * @description
+ */
+public enum TableDataTypeEnum {
+    /**
+     * ruleMetric 指标管理
+     * standardValue 标准值
+     * ruleTemplate 规则模板
+     * linkisUdf Linkis UDF
+     * linkisDataSource Linkis数据源
+     */
+    RULE_METRIC("ruleMetric", "指标管理"),
+    RULE_TEMPLATE("ruleTemplate", "规则模板"),
+    LINKIS_DATA_SOURCE("linkisDataSource", "Linkis数据源");
+
+    private String code;
+    private String desc;
+
+    TableDataTypeEnum(String code, String desc) {
+        this.code = code;
+        this.desc = desc;
+    }
+
+    public String getCode() {
+        return code;
+    }
+
+    public String getDesc() {
+        return desc;
+    }
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateActionTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateActionTypeEnum.java
index 38178569..5b07bdec 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateActionTypeEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateActionTypeEnum.java
@@ -21,12 +21,13 @@
  */
 public enum TemplateActionTypeEnum {
     /**
-     * SQL, Java, Scala, Python
+     * SQL, Java, Scala, Python,METADATA(元数据接口,前端写死)
      */
     SQL(1, "SQL"),
     JAVA(2, "Java"),
     SCALA(3, "Scala"),
     PYTHON(4, "Python"),
+    METADATA(5, "Metadata"),
     ;
 
     private Integer code;
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckLevelEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckLevelEnum.java
new file mode 100644
index 00000000..cd7e3b02
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckLevelEnum.java
@@ -0,0 +1,58 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum TemplateCheckLevelEnum {
+
+    /**
+     * 表级,字段级
+     */
+    TABLE_LEVEL(1, "表级"),
+    FIELD_LEVEL(2, "字段级"),
+    ;
+
+    private Integer code;
+    private String message;
+
+    TemplateCheckLevelEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static List<Map<String, Object>> getTemplateCheckLevelList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (TemplateCheckLevelEnum templateCheckLevelEnum : TemplateCheckLevelEnum.values()) {
+            Map<String, Object> item = new HashMap<>();
+            item.put("code", templateCheckLevelEnum.code);
+            item.put("message", templateCheckLevelEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+    public static String getMessage(Integer code) {
+        for (TemplateCheckLevelEnum templateCheckLevelEnum : TemplateCheckLevelEnum.values()) {
+            if (templateCheckLevelEnum.getCode().equals(code)) {
+                return templateCheckLevelEnum.getMessage();
+            }
+        }
+        return "";
+    }
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckTypeEnum.java
new file mode 100644
index 00000000..c902e779
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateCheckTypeEnum.java
@@ -0,0 +1,57 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum TemplateCheckTypeEnum {
+
+    /**
+     * 固定值
+     */
+    Fixed_value(1, "固定值"),
+    ;
+
+    private Integer code;
+    private String message;
+
+    TemplateCheckTypeEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public static List<Map<String, Object>> getTemplateCheckTypeList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (TemplateCheckTypeEnum templateCheckTypeEnum : TemplateCheckTypeEnum.values()) {
+            Map<String, Object> item = new HashMap<>();
+            item.put("code", templateCheckTypeEnum.code);
+            item.put("message", templateCheckTypeEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+    public static String getMessage(Integer code) {
+        for (TemplateCheckTypeEnum templateCheckTypeEnum : TemplateCheckTypeEnum.values()) {
+            if (templateCheckTypeEnum.getCode().equals(code)) {
+                return templateCheckTypeEnum.getMessage();
+            }
+        }
+        return "";
+    }
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateDataSourceTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateDataSourceTypeEnum.java
index 8c126da5..bc02754e 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateDataSourceTypeEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateDataSourceTypeEnum.java
@@ -16,6 +16,8 @@
 
 package com.webank.wedatasphere.qualitis.rule.constant;
 
+import org.apache.commons.lang3.StringUtils;
+
 /**
  * @author howeye
 */
@@ -51,15 +53,19 @@ public static String getMessage(Integer code) {
                 return templateDataSourceTypeEnum.getMessage();
             }
         }
-        return TemplateDataSourceTypeEnum.HIVE.getMessage();
+        return null;
     }
 
     public static Integer getCode(String message) {
+        String dataSourceTypeName = null;
+        if (StringUtils.isNotBlank(message)) {
+            dataSourceTypeName = message.toLowerCase();
+        }
         for (TemplateDataSourceTypeEnum templateDataSourceTypeEnum : TemplateDataSourceTypeEnum.values()) {
-            if (templateDataSourceTypeEnum.getMessage().toLowerCase().equals(message)) {
+            if (templateDataSourceTypeEnum.getMessage().toLowerCase().equals(dataSourceTypeName)) {
                 return templateDataSourceTypeEnum.getCode();
             }
         }
-        return TemplateDataSourceTypeEnum.HIVE.getCode();
+        return null;
     }
 }
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateFileTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateFileTypeEnum.java
new file mode 100644
index 00000000..611b3296
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateFileTypeEnum.java
@@ -0,0 +1,73 @@
+package com.webank.wedatasphere.qualitis.rule.constant;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public enum TemplateFileTypeEnum {
+
+    /**
+     * -- 目录容量 目录文件数 分区数
+     */
+    DIRECTORY_CAPACITY(1, "目录容量"),
+    NUMBER_CATALOG_FILES(2, "目录文件数"),
+    NUMBER_PARTITIONS(3, "分区数");
+
+    private Integer code;
+    private String message;
+
+    TemplateFileTypeEnum(Integer code, String message) {
+        this.code = code;
+        this.message = message;
+    }
+
+    public static List<Map<String, Object>> getTemplateFileTypeList() {
+        List<Map<String, Object>> list = new ArrayList<Map<String, Object>>();
+        for (TemplateFileTypeEnum templateFileTypeEnum : TemplateFileTypeEnum.values()) {
+            Map<String, Object> item = new HashMap<>();
+            item.put("code", templateFileTypeEnum.code);
+            item.put("message", templateFileTypeEnum.message);
+            list.add(item);
+
+        }
+        return list;
+    }
+
+    public static String getTemplateFileTypeByCode(Integer code) {
+        for (TemplateFileTypeEnum e : TemplateFileTypeEnum.values()) {
+            if (code.equals(e.getCode())) {
+                return e.getMessage();
+            }
+        }
+        return null;
+    }
+
+    public static Integer getCode(String message) {
+        for (TemplateFileTypeEnum templateFileTypeEnum : TemplateFileTypeEnum.values()) {
+            if (templateFileTypeEnum.getMessage().equals(message)) {
+                return templateFileTypeEnum.getCode();
+            }
+        }
+        return null;
+    }
+
+    public Integer getCode() {
+        return code;
+    }
+
+    public void setCode(Integer code) {
+        this.code = code;
+    }
+
+    public String getMessage() {
+        return message;
+    }
+
+    public void setMessage(String message) {
+        this.message = message;
+    }
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateInputTypeEnum.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateInputTypeEnum.java
index c9fcdde6..4a6555fa 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateInputTypeEnum.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/constant/TemplateInputTypeEnum.java
@@ -16,6 +16,12 @@
 
 package com.webank.wedatasphere.qualitis.rule.constant;
 
+import com.google.common.collect.Maps;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
 /**
  * Enum in TemplateInputType of RuleTemplate
  * @author howeye
@@ -24,52 +30,116 @@ public enum TemplateInputTypeEnum {
     /**
      * Enum in TemplateInputType of RuleTemplate
     */
-    FIXED_VALUE(1, "固定值"),
-    TABLE(3, "数据表"),
-    FIELD(4, "字段"),
-    DATABASE(5, "数据库"),
-    FIELD_CONCAT(6, "字段拼接"),
-    REGEXP(7, "正则"),
-    LIST(8, "数组"),
-    CONDITION(9, "条件"),
+    FIXED_VALUE(1, "固定值","Fixed value"),
+    TABLE(3, "数据表","Data table"),
+    FIELD(4, "校验字段","Field"),
+    DATABASE(5, "数据库","Database"),
+    FIELD_CONCAT(6, "字段拼接","Field concat"),
+    REGEXP(7, "正则","Regexp"),
+    LIST(8, "枚举值","List"),
+    CONDITION(9, "基础过滤条件","Condition"),
     /**
      * Provided for multi-table verification template
     */
-    AND_CONCAT(10, "AND拼接"),
-    SOURCE_DB(11, "来源数据库"),
-    SOURCE_TABLE(12, "来源表"),
-    TARGET_DB(13, "目标数据库"),
-    TARGET_TABLE(14, "目标表"),
-    LEFT_STATEMENT(15, "join左表达式"),
-    OPERATION(16, "join操作符"),
-    RIGHT_STATEMENT(17, "join右表达式"),
-    SOURCE_FIELD(18, "join左字段"),
-    TARGET_FIELD(19, "join右字段"),
-    FRONT_CONDITION(20, "前置条件"),
-    BEHIND_CONDITION(21, "后置条件"),
-    SOURCE_FIELDS(22, "来源字段"),
-    TARGET_FIELDS(23, "目标字段"),
+    AND_CONCAT(10, "AND拼接","And concat"),
+    SOURCE_DB(11, "源数据库","Source db"),
+    SOURCE_TABLE(12, "源数据表","Source table"),
+    TARGET_DB(13, "目标数据库","Target db"),
+    TARGET_TABLE(14, "目标数据表","Target table"),
+    LEFT_STATEMENT(15, "join左表达式","Left statement"),
+    OPERATION(16, "join操作符","Operation"),
+    RIGHT_STATEMENT(17, "join右表达式","Right statement"),
+    SOURCE_FIELD(18, "join左字段","Source field"),
+    TARGET_FIELD(19, "join右字段","Target field"),
+    FRONT_CONDITION(20, "前置条件","Front condition"),
+    BEHIND_CONDITION(21, "后置条件","Behind condition"),
+    SOURCE_FIELDS(22, "来源字段","Source fields"),
+    TARGET_FIELDS(23, "目标字段","Target fields"),
     /**
      * Provided for primary line repeat
     */
-    FIELD_REPLACE_NULL_CONCAT(24, "替换空字段拼接"),
+    FIELD_REPLACE_NULL_CONCAT(24, "替换空字段拼接","Field replace null concat"),
+    CONTRAST_TYPE(25, "比对方向","Contrast type"),
+
+    VALUE_RANGE(28, "数值范围","Value range"),
+    EXPRESSION(29, "表达式","Expression"),
+    SOURCE_BASIC_FILTER_CONDITIONS(30, "源基础过滤条件","Source basic filter conditions"),
+    TARGET_BASIC_FILTER_CONDITIONS(31, "目标基础过滤条件","Target basic filter conditions"),
+    CONNECT_FIELDS(32, "连接字段设置","Connect fields"),
+    COMPARISON_FIELD_SETTINGS(33, "比对字段设置","Comparison field settings"),
+    COMPARISON_RESULTS_FOR_FILTER(34, "比对结果过滤条件","comparison results for filter"),
+
+    FILTER_BY(35, "筛选方式","Filter by"),
+
+
+    //最大值、最小值 表达式
+    MAXIMUM(36, "最大值","Maximum"),
+    INTERMEDIATE_EXPRESSION(37, "中间表达式","Intermediate expression"),
+    MINIMUM(38, "最小值","Minimum"),
+    STANDARD_VALUE_EXPRESSION(39, "标准值表达式","standard value expression")
     ;
 
     private Integer code;
-    private String message;
+    private String cnMessage;
+    private String enMessage;
 
-    TemplateInputTypeEnum(Integer code, String message) {
+    TemplateInputTypeEnum(Integer code, String cnMessage, String enMessage) {
         this.code = code;
-        this.message = message;
+        this.cnMessage = cnMessage;
+        this.enMessage = enMessage;
+    }
+
+    public static Map<String, Object> getTemplateData(Integer type) {
+        for (TemplateInputTypeEnum c : TemplateInputTypeEnum.values()) {
+            if (c.getCode().equals(type)) {
+                Map<String, Object> item = new HashMap<>();
+                item.put("code", c.code);
+                item.put("cnMessage", c.cnMessage);
+                item.put("enMessage", c.enMessage);
+                return item;
+            }
+        }
+        return Maps.newHashMap();
+    }
+
+
+    public static List<Map<String, Object>> getTemplateInputName(Integer code) {
+        List<Map<String, Object>> hashMaps = new ArrayList<>();
+        for (TemplateInputTypeEnum c : TemplateInputTypeEnum.values()) {
+            if (c.getCode().equals(code)) {
+                Map<String, Object> item = new HashMap<>();
+                item.put("code", c.code);
+                item.put("cnMessage", c.cnMessage);
+                item.put("enMessage", c.enMessage);
+                hashMaps.add(item);
+            }
+        }
+        return hashMaps;
     }
 
     public Integer getCode() {
         return code;
     }
 
-    public String getMessage() {
-        return message;
+    public void setCode(Integer code) {
+        this.code = code;
+    }
+
+    public String getCnMessage() {
+        return cnMessage;
+    }
+
+    public void setCnMessage(String cnMessage) {
+        this.cnMessage = cnMessage;
+    }
+
+    public String getEnMessage() {
+        return enMessage;
+    }
+
+    public void setEnMessage(String enMessage) {
+        this.enMessage = enMessage;
     }
-}
+}
\ No newline at end of file
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmArgumentsExecutionParametersDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmArgumentsExecutionParametersDao.java
new file mode 100644
index 00000000..d204a9ec
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmArgumentsExecutionParametersDao.java
@@ -0,0 +1,42 @@
+package com.webank.wedatasphere.qualitis.rule.dao;
+
+import com.webank.wedatasphere.qualitis.rule.entity.AlarmArgumentsExecutionParameters;
+import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters;
+
+import java.util.List;
+import java.util.Set;
+
+/**
+ * @author v_gaojiedeng@webank.com
+ */
+public interface AlarmArgumentsExecutionParametersDao {
+
+    /**
+     * Find AlarmArgumentsExecutionParameters by executionParameters
+     * @param executionParameters
+     * @return
+     */
+    List<AlarmArgumentsExecutionParameters> findByExecutionParameters(ExecutionParameters executionParameters);
+
+    /**
+     * Find AlarmArgumentsExecutionParameters by id
+     * @param alarmArgumentsExecutionParametersId
+     * @return
+     */
+    AlarmArgumentsExecutionParameters findById(Long alarmArgumentsExecutionParametersId);
+
+    /**
+     * Save all AlarmArgumentsExecutionParameters .
+     * @param alarmArgumentsExecutionParameters
+     * @return
+     */
+    Set<AlarmArgumentsExecutionParameters> saveAll(List<AlarmArgumentsExecutionParameters> alarmArgumentsExecutionParameters);
+
+    /**
+     * Delete AlarmArgumentsExecutionParameters by ExecutionParameters
+     * @param executionParametersInDb
+     */
+    void deleteByExecutionParameters(ExecutionParameters executionParametersInDb);
+
+
+}
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmConfigDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmConfigDao.java
index fba67a88..b655fba0 100644
--- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmConfigDao.java
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/AlarmConfigDao.java
@@ -16,6 +16,7 @@
 
 package com.webank.wedatasphere.qualitis.rule.dao;
 
+import com.webank.wedatasphere.qualitis.entity.RuleMetric;
 import com.webank.wedatasphere.qualitis.rule.entity.AlarmConfig;
 
 import java.util.List;
@@ -32,4 +33,11 @@
      */
     List<AlarmConfig> saveAllAlarmConfig(List<AlarmConfig> alarmConfigs);
 
+    /**
+     * Get by rule metric
+     * @param ruleMetric
+     * @return
+     */
+    List<AlarmConfig> getByRuleMetric(RuleMetric ruleMetric);
+
 }
diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/BdpClientHistoryDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/BdpClientHistoryDao.java
new file mode 100644
index 00000000..269ba88a
--- /dev/null
+++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/BdpClientHistoryDao.java
@@ -0,0 +1,55 @@
+/*
+ * Copyright 2019 WeBank
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + */ + +package com.webank.wedatasphere.qualitis.rule.dao; + + +import com.webank.wedatasphere.qualitis.rule.entity.BdpClientHistory; + +/** + * @author allenzhou + */ +public interface BdpClientHistoryDao { + + /** + * Find history by rule id + * @param ruleId + * @return + */ + BdpClientHistory findByRuleId(Long ruleId); + + /** + * Save rule history + * @param bdpClientHistory + * @return + */ + BdpClientHistory save(BdpClientHistory bdpClientHistory); + + /** + * Delete + * @param bdpClientHistory + */ + void delete(BdpClientHistory bdpClientHistory); + + /** + * Find create rule history by templateFunction and datasource and project name + * @param templateFuncName + * @param s + * @param s1 + * @return + */ + BdpClientHistory findByTemplateFunctionAndDatasourceAndProjectName(String templateFuncName, String s, String s1); +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/DataVisibilityDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/DataVisibilityDao.java new file mode 100644 index 00000000..472f7b1a --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/DataVisibilityDao.java @@ -0,0 +1,43 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.DataVisibility; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-09-14 9:44 + * @description + */ +public interface DataVisibilityDao { + + /** + * Save batch DataVisibility + * @param dataVisibilityList + */ + void
saveAll(List dataVisibilityList); + + /** + * Delete DataVisibility by tableDataId and tableDataType + * @param tableDataId + * @param tableDataType + */ + void delete(Long tableDataId, String tableDataType); + + /** + * Filter by tableDataId and tableDataType + * @param tableDataId + * @param tableDataType + * @return + */ + List findByTableDataIdAndTableDataType(Long tableDataId, String tableDataType); + + /** + * Filter by tableDataIds and tableDataType + * @param tableDataIds + * @param tableDataType + * @return + */ + List findByTableDataIdsAndTableDataType(List tableDataIds, String tableDataType); + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionParametersDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionParametersDao.java new file mode 100644 index 00000000..f39d1173 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionParametersDao.java @@ -0,0 +1,87 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters; + +import java.util.List; + +/** + * @author v_gaojiedeng + */ +public interface ExecutionParametersDao { + + /** + * find All ExecutionParameters + * @param projectId + * @param name + * @param page + * @param size + * @return + */ + List findAllExecutionParameters(Long projectId, String name, int page, int size); + + /** + * find All + * @param projectId + * @param page + * @param size + * @return + */ + List findAll(Long projectId, int page, int size); + + /** + * find By Id + * @param executionParametersId + * @return + */ + ExecutionParameters findById(Long executionParametersId); + + /** + * save ExecutionParameters + * @param executionParameters + * @return + */ + ExecutionParameters saveExecutionParameters(ExecutionParameters executionParameters); + + /** + * delete ExecutionParameters + * @param executionParameters + */ + void 
deleteExecutionParameters(ExecutionParameters executionParameters); + + /** + * count Total + * @param projectId + * @return + */ + int countTotal(Long projectId); + + /** + * count Total By Name + * @param projectId + * @param name + * @return + */ + int countTotalByName(Long projectId, String name); + + /** + * find By Name And ProjectId + * @param name + * @param projectId + * @return + */ + ExecutionParameters findByNameAndProjectId(String name, Long projectId); + + /** + * get All ExecutionParameters + * @param projectId + * @return + */ + List getAllExecutionParameters(Long projectId); + + /** + * select All executionParameters + * @return + */ + List selectAllExecutionParameters(); + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionVariableDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionVariableDao.java new file mode 100644 index 00000000..812bf157 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/ExecutionVariableDao.java @@ -0,0 +1,52 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters; +import com.webank.wedatasphere.qualitis.rule.entity.ExecutionVariable; + +import java.util.List; +import java.util.Set; + +/** + * @author v_gaojiedeng@webank.com + */ +public interface ExecutionVariableDao { + + /** + * Find ExecutionVariable by executionParameters + * @param executionParameters + * @return + */ + List findByExecutionParameters(ExecutionParameters executionParameters); + + /** + * Find ExecutionVariable by id + * @param executionVariableId + * @return + */ + ExecutionVariable findById(Long executionVariableId); + + /** + * Save all ExecutionVariable . 
+ * @param executionVariables + * @return + */ + Set saveAll(List executionVariables); + + /** + * Delete StaticExecutionParameters by ExecutionParameters + * @param executionParametersInDb + */ + void deleteByExecutionParameters(ExecutionParameters executionParametersInDb); + + /** + * select Mate ExecutionVariable + * @param type + * @param name + * @param value + * @param executionParametersInDb + * @return + */ + List selectMateExecutionVariable(Integer type,String name,String value,ExecutionParameters executionParametersInDb); + + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/LinkisDataSourceDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/LinkisDataSourceDao.java new file mode 100644 index 00000000..1b001c02 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/LinkisDataSourceDao.java @@ -0,0 +1,69 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.LinkisDataSource; +import org.springframework.data.domain.Page; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-09-16 9:57 + * @description + */ +public interface LinkisDataSourceDao { + + /** + * save + * @param linkisDataSource + */ + void save(LinkisDataSource linkisDataSource); + + /** + * get By Linkis Data Source Id + * @param dataSourceId + * @return + */ + LinkisDataSource getByLinkisDataSourceId(Long dataSourceId); + + /** + * get By Linkis Data Source Id + * @param dataSourceName + * @return + */ + LinkisDataSource getByLinkisDataSourceName(String dataSourceName); + + /** + * get By Linkis Data Source Ids + * @param dataSourceIds + * @return + */ + List getByLinkisDataSourceIds(List dataSourceIds); + + /** + * filter + * @param dataSourceName + * @param dataSourceTypeId + * @param dataVisibilityDeptList + * @param createUser + * @param searchCreateUser + * @param searchModifyUser + * @param subSystemName + * 
@param devDepartmentId + * @param opsDepartmentId + * @param ignoreDataAuthorityCondition Ignore the data-permission query restrictions; generally used by administrators + * @param searchDataVisibilityDeptList + * @param page + * @param size + * @return + */ + Page filterWithPage(String dataSourceName, Long dataSourceTypeId, List dataVisibilityDeptList, String createUser + , String searchCreateUser, String searchModifyUser, String subSystemName, Long devDepartmentId, Long opsDepartmentId + , Boolean ignoreDataAuthorityCondition, List searchDataVisibilityDeptList, int page, int size); + + /** + * get all datasource name + * @return + */ + List getAllDataSourceNameList(); + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NamingConventionsDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NamingConventionsDao.java new file mode 100644 index 00000000..bc55fa7f --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NamingConventionsDao.java @@ -0,0 +1,17 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.NamingConventions; + +import java.util.List; + +/** + * @author v_gaojiedeng@webank.com + */ +public interface NamingConventionsDao { + + /** + * find All + * @return + */ + List findAll(); +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NoiseEliminationManagementDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NoiseEliminationManagementDao.java new file mode 100644 index 00000000..540c2be2 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/NoiseEliminationManagementDao.java @@ -0,0 +1,42 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.ExecutionParameters; +import com.webank.wedatasphere.qualitis.rule.entity.NoiseEliminationManagement; + +import java.util.List; +import java.util.Set; + +/** + * @author
v_gaojiedeng@webank.com + */ +public interface NoiseEliminationManagementDao { + + /** + * Find NoiseEliminationManagement by executionParameters + * @param executionParameters + * @return + */ + List findByExecutionParameters(ExecutionParameters executionParameters); + + /** + * Find NoiseEliminationManagement by id + * @param noiseEliminationManagementId + * @return + */ + NoiseEliminationManagement findById(Long noiseEliminationManagementId); + + /** + * Save all NoiseEliminationManagement . + * @param noiseEliminationManagement + * @return + */ + Set saveAll(List noiseEliminationManagement); + + /** + * Delete NoiseEliminationManagement by ExecutionParameters + * @param executionParametersInDb + */ + void deleteByExecutionParameters(ExecutionParameters executionParametersInDb); + + +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDao.java index 61cf71e1..973c7f5e 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDao.java @@ -19,9 +19,11 @@ import com.webank.wedatasphere.qualitis.project.entity.Project; import com.webank.wedatasphere.qualitis.rule.entity.Rule; import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; - import com.webank.wedatasphere.qualitis.rule.entity.Template; +import org.springframework.data.domain.Page; + import java.util.List; +import java.util.Map; /** * @author howeye @@ -30,6 +32,7 @@ public interface RuleDao { /** * Count by project + * * @param project * @return */ @@ -37,6 +40,7 @@ public interface RuleDao { /** * Find rule by project + * * @param project * @return */ @@ -44,6 +48,7 @@ public interface RuleDao { /** * Find by project with page. 
+ * * @param project * @param page * @param size @@ -52,24 +57,74 @@ public interface RuleDao { List findByProjectWithPage(Project project, int page, int size); /** - * Find by rule metric ID with page. - * @param ruleMetricId + * Find all rule id, name by project + * + * @param project + * @return + */ + List> findSpecialInfoByProject(Project project); + + /** + * find rules by project and some conditions + * + * @param project + * @param ruleName + * @param ruleCnName + * @param ruleTemplateId + * @param db + * @param table + * @param ruleEnable + * @param createUser + * @param modifyUser + * @param startCreateTime + * @param endCreateTime + * @param startModifyTime + * @param endModifyTime + * @param ruleGroupName + * @param workFlowSpace + * @param workFlowProject + * @param workFlowName + * @param nodeName * @param page * @param size * @return */ -// List findByRuleMetricWithPage(Long ruleMetricId, int page, int size); + Page findByConditionWithPage(Project project, String ruleName, String ruleCnName, Integer ruleTemplateId, String db, String table, Boolean ruleEnable + , String createUser, String modifyUser, String startCreateTime, String endCreateTime, String startModifyTime, String endModifyTime, String ruleGroupName, + String workFlowSpace, String workFlowProject, String workFlowName, String nodeName, int page, int size); /** - * Find rule by project + * Find rule by project and rule name + * * @param project * @param ruleName * @return */ Rule findByProjectAndRuleName(Project project, String ruleName); + /** + * Find rule by workflow name and project + * + * @param project + * @param workflowName + * @param ruleName + * @return + */ + Rule findHighestVersionByProjectAndWorkFlowName(Project project, String workflowName, String ruleName); + + /** + * Find rule by project and rule names + * + * @param project + * @param workflowName + * @param ruleNames + * @return + */ + List findByProjectAndWorkflowNameAndRuleNames(Project project, String workflowName, 
List ruleNames); + /** * Save rule + * * @param rule * @return */ @@ -77,6 +132,7 @@ public interface RuleDao { /** * Find rule by id + * * @param ruleId * @return */ @@ -84,34 +140,200 @@ public interface RuleDao { /** * Delete rule + * * @param rule */ void deleteRule(Rule rule); /** - * Delete all rules - * @param rules + * Delete by id + * + * @param ruleId */ - void deleteAllRule(List rules); + void deleteById(Long ruleId); /** * Find rules by ids + * * @param ruleIds * @return */ List findByIds(List ruleIds); + /** + * Count rule list by rule group + * + * @param ruleGroup + * @return + */ + int countByRuleGroup(RuleGroup ruleGroup); + /** * Find rule list by rule group + * * @param ruleGroup * @return */ List findByRuleGroup(RuleGroup ruleGroup); + /** + * Find rule list by rule group with page + * + * @param page + * @param size + * @param ruleGroup + * @param templateId + * @param name + * @param cnName + * @param cols + * @param ruleType + * @return + */ + List findByRuleGroupWithPage(int page, int size, RuleGroup ruleGroup, Long templateId, String name, String cnName, List cols, Integer ruleType); + + /** + * count by rule group with page + * + * @param ruleGroup + * @param templateId + * @param name + * @param cnName + * @param cols + * @param ruleType + * @return + */ + Long countByRuleGroupWithPage(RuleGroup ruleGroup, Long templateId, String name, String cnName, List cols, Integer ruleType); + + /** + * find By Rule Group And File Out Name With Page + * + * @param page + * @param size + * @param ruleGroup + * @param templateId + * @param name + * @param cnName + * @param cols + * @param ruleType + * @return + */ + List findByRuleGroupAndFileOutNameWithPage(int page, int size, RuleGroup ruleGroup, Long templateId, String name, String cnName, List cols, Integer ruleType); + + /** + * count By Rule Group And File Out Name + * + * @param ruleGroup + * @param templateId + * @param name + * @param cnName + * @param cols + * @param ruleType + * @return + */ 
+ Long countByRuleGroupAndFileOutName(RuleGroup ruleGroup, Long templateId, String name, String cnName, List cols, Integer ruleType); + /** * Find rule by rule template + * * @param templateInDb * @return */ List findByTemplate(Template templateInDb); + + /** + * Find rule by projectId,name + * + * @param projectId + * @param name + * @return + */ + List getDeployExecutionParameters(Long projectId, String name); + + /** + * Paging Rule by datasource + * + * @param clusterName + * @param dbName + * @param tableName + * @param colName + * @param user + * @param ruleTemplateId + * @param relationObjectType + * @param page + * @param size + * @return + */ + Page findRuleByDataSource(String clusterName, String dbName, String tableName, String colName, String user, Long ruleTemplateId, Integer relationObjectType, int page, int size); + + /** + * Count by ruleName and projectId + * + * @param ruleName + * @param projectId + * @return + */ + int countByProjectAndRuleName(String ruleName, Long projectId); + + /** + * select mate rule by ruleName workFlowName workFlowVersion + * + * @param ruleName + * @param workFlowName + * @param workFlowVersion + * @param projectId + * @return + */ + Long selectMateRule(String ruleName, String workFlowName, String workFlowVersion, Long projectId); + + /** + * find MinWorkFlowVersion + * + * @param ruleName + * @param projectId + * @return + */ + Rule findMinWorkFlowVersionRule(String ruleName, Long projectId); + + /** + * find All By Id + * + * @param ruleIds + * @return + */ + List findAllById(List ruleIds); + + /** + * save Rules + * + * @param rules + * @return + */ + List saveRules(List rules); + + /** + * find Exist Standard Vaule + * + * @param templateId + * @param projectId + * @return + */ + List findExistStandardVaule(Long templateId, Long projectId); + + /** + * find Custom Rule Type By Project + * + * @param ruleType + * @param projectId + * @return + */ + List findCustomRuleTypeByProject(Integer ruleType, Long projectId); + 
+ /** + * find Work Flow Filed + * + * @param projectId + * @return + */ + List> findWorkFlowFiled(Long projectId); + } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceCountDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceCountDao.java index a39a696b..deb54b84 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceCountDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceCountDao.java @@ -17,6 +17,8 @@ package com.webank.wedatasphere.qualitis.rule.dao; import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSourceCount; +import java.util.List; +import java.util.Set; /** * @author allenzhou @@ -38,4 +40,10 @@ public interface RuleDataSourceCountDao { */ RuleDataSourceCount save(RuleDataSourceCount ruleDataSourceCount); + /** + * Save in batch + * @param ruleDataSourceCountSet + * @return + */ + List saveAll(Set ruleDataSourceCountSet); } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceDao.java index a76a85a4..80ee6afb 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceDao.java @@ -18,6 +18,8 @@ import com.webank.wedatasphere.qualitis.rule.entity.Rule; import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSource; +import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; +import org.springframework.data.domain.Page; import java.util.List; import java.util.Map; @@ -28,29 +30,31 @@ public interface RuleDataSourceDao { /** - * Save all rule datasource + * Save all rule datasource. 
* @param ruleDataSources * @return */ List saveAllRuleDataSource(List ruleDataSources); /** - * Find rule datasource by rule - * @param rule + * Find rule datasource by project id for add multi db rules. + * @param projectId * @return */ - List findByRule(Rule rule); + List findByProjectId(Long projectId); /** - * Find rule datasource by project id - * @param projectId + * Find only cols' name. + * @param user + * @param clusterName + * @param dbName + * @param tableName * @return */ - List findByProjectId(Long projectId); - + List findColsByUser(String user, String clusterName, String dbName, String tableName); /** - * Find rule datasoruce by project id and datasource(cluster, db, table) + * Find rule datasoruce by project id and datasource(cluster, db, table). * @param projectId * @param cluster * @param db @@ -73,52 +77,57 @@ public interface RuleDataSourceDao { * @param size * @return */ - List> findProjectDsByUser(String user, int page, int size); - - /** - * Find rules related with cluster name, database name ,table name, column name. - * @param clusterName - * @param dbName - * @param tableName - * @param colName - * @param user - * @return - */ - List findRuleByDataSource(String clusterName, String dbName, String tableName, String colName, String user); + List> findProjectDsByUserPage(String user, int page, int size); + + /** + * find Column By DataSource + * @param clusterName + * @param dbName + * @param tableName + * @param colName + * @param user + * @param page + * @param size + * @return + */ + List findColumnByDataSource(String clusterName, String dbName, String tableName, String colName, String user, int page, int size); /** - * Paging rules related with cluster name, database name ,table name, column name. 
+ * count Rule Count By Group * @param clusterName * @param dbName - * @param tableName - * @param colName + * @param tableNames * @param user - * @param page - * @param size * @return */ - List findRuleByDataSource(String clusterName, String dbName, String tableName, String colName, String user, int page, int size); + List> countRuleCountByGroup(List clusterName, List dbName, List tableNames, String user); /** - * Count. + * count Rule Count By Group * @param clusterName * @param dbName - * @param tableName - * @param colName - * @param user + * @param tableNames + * @param fieldNames + * @param users * @return */ - int countRuleByDataSource(String clusterName, String dbName, String tableName, String colName, String user); + List> countRuleCountByGroup(List clusterName, List dbName, List tableNames, List fieldNames, List users); /** - * Filter rule datasource + * Count rule datasource. * @param user * @param clusterName * @param dbName * @param tableName + * @param datasourceType + * @param subSystemId + * @param departmentCode + * @param devDepartmentName + * @param tagCode + * @param envName * @return */ - List> filterProjectDsByUser(String user, String clusterName, String dbName, String tableName); + long countProjectDsByUser(String user, String clusterName, String dbName, String tableName, Integer datasourceType, Long subSystemId, String departmentCode, String devDepartmentName, String tagCode, String envName); /** * Filter rule datasource pageable. 
@@ -126,11 +135,18 @@ public interface RuleDataSourceDao { * @param clusterName * @param dbName * @param tableName + * @param datasourceType + * @param subSystemId + * @param departmentName + * @param devDepartmentName + * @param tagCode + * @param envName * @param page * @param size * @return */ - List> filterProjectDsByUserPage(String user, String clusterName, String dbName, String tableName, int page, int size); + List> filterProjectDsByUserPage(String user, String clusterName, String dbName, String tableName, + Integer datasourceType, Long subSystemId, String departmentName, String devDepartmentName, String tagCode, String envName, int page, int size); /** * Save rule datasource @@ -140,22 +156,97 @@ public interface RuleDataSourceDao { RuleDataSource saveRuleDataSource(RuleDataSource newRuleDataSource); /** - * Find cols' name. + * Find all datasources by user for datasource execution. * @param user * @param clusterName * @param dbName * @param tableName * @return */ - List findColsByUser(String user, String clusterName, String dbName, String tableName); + List findDatasourcesByUser(String user, String clusterName, String dbName, String tableName); /** - * Find all datasources by user. - * @param user + * Find rule create users for rule count. 
* @param clusterName * @param dbName * @param tableName + * @param userName * @return */ - List findDatasourcesByUser(String user, String clusterName, String dbName, String tableName); + List findRuleCreateUserByDataSource(String clusterName, String dbName, String tableName, String userName); + + /** + * Paging datasources + * @param page + * @param size + * @return + */ + Page findAllWithPage(int page, int size); + + /** + * find all tagCode and tagName in Table + * @param loginUser + * @return + */ + List findAllTagByUser(String loginUser); + + /** + * find By Rule Id + * @param ruleIds + * @return + */ + List findByRuleId(List ruleIds); + + /** + * findRuleGroupIds + * @param projectId + * @param dbName + * @param tableName + * @return + */ + List findRuleGroupIds(Long projectId, String dbName, String tableName); + + /** + * update linkisDataSourceName + * @param linkisDataSourceId + * @param linkisDataSourceName + */ + void updateLinkisDataSourceName(Long linkisDataSourceId, String linkisDataSourceName); + + /** + * delete by rule + * @param rule + */ + void deleteByRule(Rule rule); + + /** + * delete by group + * @param ruleGroup + */ + void deleteByRuleGroup(RuleGroup ruleGroup); + + /** + * delete By Rule List + * @param rules + */ + void deleteByRuleList(List rules); + + /** + * delete By Rule Group List + * @param ruleGroups + */ + void deleteByRuleGroupList(List ruleGroups); + + /** + * Update metadata fields + * @param id + * @param subSystemId + * @param subSystemName + * @param departmentCode + * @param departmentName + * @param devDepartmentName + * @param tagCode + * @param tagName + */ + void updateMetadataFields(Long id, Long subSystemId, String subSystemName, String departmentCode, String departmentName, String devDepartmentName, String tagCode, String tagName); } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceMappingDao.java
b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceMappingDao.java index 6470589c..d1074a96 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceMappingDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDataSourceMappingDao.java @@ -19,6 +19,8 @@ import com.webank.wedatasphere.qualitis.rule.entity.Rule; import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSourceMapping; +import java.util.List; + /** * @author howeye */ @@ -31,10 +33,29 @@ public interface RuleDataSourceMappingDao { */ RuleDataSourceMapping saveRuleDataSourceMapping(RuleDataSourceMapping ruleDataSourceMapping); + /** + * Save all + * @param ruleDataSourceMappingList + * @return + */ + List saveAll(List ruleDataSourceMappingList); + /** * Delete rule datasource mapping by rule * @param rule */ void deleteByRule(Rule rule); + /** + * delete By Rule List + * @param rules + */ + void deleteByRuleList(List rules); + + /** + * find By Rule List + * @param ruleList + * @return + */ + List findByRuleList(List ruleList); } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDatasourceEnvDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDatasourceEnvDao.java new file mode 100644 index 00000000..557108c1 --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleDatasourceEnvDao.java @@ -0,0 +1,52 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSource; +import com.webank.wedatasphere.qualitis.rule.entity.RuleDataSourceEnv; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-08-03 16:00 + * @description + */ +public interface RuleDatasourceEnvDao { + + /** + * Find by env id. 
+ * @param envId + * @return + */ + RuleDataSourceEnv findByEnvId(Long envId); + + /** + * Save all + * @param datasourceEnvList + */ + void saveAllRuleDataSourceEnv(List datasourceEnvList); + + /** + * Find all + * @return + */ + List findAllEnvName(); + + /** + * Delete all + * @param datasourceEnvList + */ + void deleteAll(List datasourceEnvList); + + /** + * Delete by id + * @param datasourceId + */ + void deleteByDataSourceId(Long datasourceId); + + /** + * find By Rule Data Source List + * @param ruleDataSourceList + * @return + */ + List findByRuleDataSourceList(List ruleDataSourceList); +} diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleGroupDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleGroupDao.java index 0d695263..0039c3a9 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleGroupDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleGroupDao.java @@ -17,7 +17,9 @@ package com.webank.wedatasphere.qualitis.rule.dao; import com.webank.wedatasphere.qualitis.rule.entity.RuleGroup; + import java.util.List; +import java.util.Map; /** * @author howeye @@ -45,6 +47,12 @@ public interface RuleGroupDao { */ void delete(RuleGroup ruleGroup); + /** + * Delete batch + * @param ruleGroups + */ + void deleteAll(List ruleGroups); + /** * Find by rule group name and project id * @param ruleGroupName @@ -59,4 +67,41 @@ public interface RuleGroupDao { * @return */ List findByProjectId(Long projectId); + + + /** + * Find By Project + * @param projectId + * @return + */ + List> findByProject(Long projectId); + + /** + * Find By ProjectId And ExistRule + * @param projectId + * @return + */ + List findByProjectIdAndExistRule(Long projectId); + + /** + * Find By ProjectId And NotExistRule + * @param projectId + * @return + */ + List findByProjectIdAndNotExistRule(Long projectId); + + /** + * Find rules by ids + * @param ruleGroupIds + * 
@return + */ + List findByIds(List ruleGroupIds); + + /** + * Find latest version + * @param ruleGroupName + * @param projectId + * @return + */ + RuleGroup findLatestVersionByRuleGroupNameAndProjectId(String ruleGroupName, Long projectId); } diff --git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleLockDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleLockDao.java new file mode 100644 index 00000000..aca7f0cc --- /dev/null +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleLockDao.java @@ -0,0 +1,66 @@ +package com.webank.wedatasphere.qualitis.rule.dao; + +import com.webank.wedatasphere.qualitis.exception.UnExpectedRequestException; +import com.webank.wedatasphere.qualitis.rule.entity.RuleLock; + +import java.util.List; + +/** + * @author v_minminghe@webank.com + * @date 2022-12-19 9:44 + * @description + */ +public interface RuleLockDao { + + /** + * check multi lock if free status + * @param lockKeys + * @param loginUser + * @param expiredTimestamp + * @return + */ + boolean checkMultiLockIfFreeStatus(List lockKeys, String loginUser, Long expiredTimestamp); + + /** + * findByLockKeyWithLock + * @param lockKey + * @return + */ + RuleLock findByLockKeyWithLock(String lockKey); + + /** + * newLock + * @param lockKey + * @param holder + * @param timestamp + * @param status + * @return + */ + Integer newLock(String lockKey, String holder, Long timestamp, Integer status); + + /** + * acquireLock + * @param lockKey + * @param holder + * @param timestamp + * @return + */ + Integer acquireLock(String lockKey, String holder, Long timestamp); + + /** + * releaseLock + * @param lockKey + * @param holder + * @param timestamp + * @return + */ + Integer releaseLock(String lockKey, String holder, Long timestamp); + + /** + * modify + * @param ruleLock + * @return + * @throws UnExpectedRequestException + */ + RuleLock modify(RuleLock ruleLock) throws UnExpectedRequestException; +} diff 
--git a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleTemplateDao.java b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleTemplateDao.java index e625b03d..70f6b1d4 100644 --- a/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleTemplateDao.java +++ b/core/project/src/main/java/com/webank/wedatasphere/qualitis/rule/dao/RuleTemplateDao.java @@ -16,11 +16,14 @@ package com.webank.wedatasphere.qualitis.rule.dao; -import com.webank.wedatasphere.qualitis.entity.Department; import com.webank.wedatasphere.qualitis.entity.User; import com.webank.wedatasphere.qualitis.rule.entity.Template; +import org.springframework.data.domain.Page; import java.util.List; +import java.util.Map; +import java.util.Optional; +import java.util.Set; /** * @author howeye @@ -28,6 +31,7 @@ public interface RuleTemplateDao { /** * Find rule template by id + * * @param ruleTemplateId * @return */ @@ -35,14 +39,28 @@ public interface RuleTemplateDao { /** * Find all rule template + * * @param page * @param size + * @param templateType + * @param cnName + * @param enName + * @param dataSourceType + * @param verificationLevel + * @param verificationType + * @param createId + * @param modifyId + * @param devDepartmentId + * @param opsDepartmentId + * @param actionRange + * @param dataType * @return */ - List
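The new `RuleLockDao` interface in this patch exposes `acquireLock`/`releaseLock` primitives that return affected-row counts, which implies compare-and-set lock semantics backed by a database row. A minimal in-memory sketch of those semantics follows; the 30-second expiry window and holder re-entrancy are assumptions for illustration only, and a real implementation would update a lock row under a pessimistic database lock (e.g. `SELECT ... FOR UPDATE`, as `findByLockKeyWithLock` suggests) rather than use a map:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical in-memory model of the RuleLockDao acquire/release contract.
class RuleLockSketch {
    // Assumed expiry window: a stale holder's lock may be taken over after this.
    private static final long EXPIRE_MS = 30_000L;

    private static final class Lock {
        String holder;
        long timestamp;
        Lock(String holder, long timestamp) { this.holder = holder; this.timestamp = timestamp; }
    }

    private final Map<String, Lock> locks = new HashMap<>();

    /** Returns 1 (one "row" affected) if the lock was free, expired, or already held by this holder; else 0. */
    public synchronized int acquireLock(String lockKey, String holder, long now) {
        Lock lock = locks.get(lockKey);
        if (lock == null || lock.holder.equals(holder) || now - lock.timestamp > EXPIRE_MS) {
            locks.put(lockKey, new Lock(holder, now));
            return 1;
        }
        return 0;
    }

    /** Returns 1 if this holder owned the lock and released it; else 0. */
    public synchronized int releaseLock(String lockKey, String holder, long now) {
        Lock lock = locks.get(lockKey);
        if (lock != null && lock.holder.equals(holder)) {
            locks.remove(lockKey);
            return 1;
        }
        return 0;
    }

    public static void main(String[] args) {
        RuleLockSketch dao = new RuleLockSketch();
        long now = 0L;
        System.out.println(dao.acquireLock("rule_1", "alice", now)); // 1: free lock acquired
        System.out.println(dao.acquireLock("rule_1", "bob", now));   // 0: still held by alice
        System.out.println(dao.releaseLock("rule_1", "bob", now));   // 0: bob is not the holder
        System.out.println(dao.releaseLock("rule_1", "alice", now)); // 1: released by its holder
        System.out.println(dao.acquireLock("rule_1", "bob", now));   // 1: free again
    }
}
```

A caller would pair the two operations around an edit session: acquire before modifying a rule, release in a `finally` block, and treat a `0` return as "locked by someone else".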