My gut feeling would be that packaging and/or classifier is the right approach. To find the correct import label for the BUILD file, I suggest looking at the generated .cache/bazel/_bazel_user/.../external/maven/BUILD file.
Edited the post into a working setup.
I want to test my own asynchronous Flink stream operator (a RichAsyncFunction) using the Bazel build tool.
This is basically the same question as How to utilize Flink's TestHarness class?
The main problem is that I cannot find the import for org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness and all the other code needed to test the streaming operator.
I tried to follow the answer given there, but I struggle with the Bazel syntax and ask myself whether these imports can be used with Bazel at all.
I am using the latest Bazel version and IntelliJ 2019.3.4 with the Bazel plugin.
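For context, here is a minimal sketch of the kind of harness-based test I am trying to write. The StreamMap/Doubler operator and the test names are placeholders rather than my actual RichAsyncFunction; it can only compile once the test-jar classes resolve.
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.operators.StreamMap;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;
import org.junit.Assert;
import org.junit.Test;

public class DataOperatorTests {

    // Placeholder operator function; in my case this would be the logic around the RichAsyncFunction.
    private static class Doubler implements MapFunction<Integer, Integer> {
        @Override
        public Integer map(Integer value) {
            return value * 2;
        }
    }

    @Test
    public void operatorEmitsOneRecordPerInput() throws Exception {
        // Wrap the operator in the test harness and drive it by hand.
        OneInputStreamOperatorTestHarness<Integer, Integer> harness =
                new OneInputStreamOperatorTestHarness<>(new StreamMap<>(new Doubler()));
        harness.open();
        harness.processElement(new StreamRecord<>(21, 0L));
        // Exactly one output record is expected for the single input element.
        Assert.assertEquals(1, harness.getOutput().size());
        harness.close();
    }
}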
The linked answer assumes these Maven dependencies:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-test-utils-junit</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-runtime_2.11</artifactId>
<version>${flink.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
My WORKSPACE file looks like this:
...
http_archive(
name = "rules_jvm_external",
sha256 = RULES_JVM_EXTERNAL_SHA,
strip_prefix = "rules_jvm_external-%s" % RULES_JVM_EXTERNAL_TAG,
url = "https://github.com/bazelbuild/rules_jvm_external/archive/%s.zip" % RULES_JVM_EXTERNAL_TAG,
)
load("@rules_jvm_external//:defs.bzl", "maven_install")
load("@rules_jvm_external//:defs.bzl", "artifact")
load("@rules_jvm_external//:specs.bzl", "maven")
load("@io_grpc_grpc_java//:repositories.bzl", "IO_GRPC_GRPC_JAVA_ARTIFACTS")
load("@io_grpc_grpc_java//:repositories.bzl", "IO_GRPC_GRPC_JAVA_OVERRIDE_TARGETS")
maven_install(
artifacts = [
...
"org.apache.commons:commons-lang3:3.9",
"org.javatuples:javatuples:1.2",
"junit:junit:4.13",
"org.testcontainers:testcontainers:1.14.1",
"org.testcontainers:kafka:1.14.1",
"org.testcontainers:postgresql:1.14.1",
"commons-io:commons-io:2.6",
"com.google.code.findbugs:jsr305:1.3.9",
"com.google.errorprone:error_prone_annotations:2.0.18",
"com.google.j2objc:j2objc-annotations:1.1",
"com.google.protobuf:protobuf-java:3.11.1",
"com.google.protobuf:protobuf-java-util:3.6.1",
"info.picocli:picocli:4.1.0",
"org.slf4j:slf4j-log4j12:1.7.5",
"org.slf4j:slf4j-api:1.7.28",
"com.github.jasync-sql:jasync-postgresql:1.0.11",
"com.github.jasync-sql:jasync-common:1.0.11",
"org.postgresql:postgresql:42.2.5",
"org.mongodb:mongodb-driver-reactivestreams:4.0.2",
"org.mongodb:mongodb-driver-core:4.0.2",
"org.mongodb:bson:4.0.2",
"org.reactivestreams:reactive-streams:1.0.3",
"joda-time:joda-time:2.9.7",
"org.apache.kafka:kafka-clients:2.4.0",
# "io.grpc:grpc-netty-shaded:%s" % GRPC_JAVA_VERSION,
# "io.grpc:grpc-protobuf:%s" % GRPC_JAVA_VERSION,
# "io.grpc:grpc-stub:%s" % GRPC_JAVA_VERSION,
"org.apache.flink:flink-core:%s" % FLINK_VERSION,
"org.apache.flink:flink-java:%s" % FLINK_VERSION,
"org.apache.flink:flink-streaming-java_%s:%s" % (SCALA_VERSION, FLINK_VERSION),
"org.apache.flink:flink-connector-kafka-0.11_%s:%s" % (SCALA_VERSION, FLINK_VERSION),
"org.apache.flink:flink-cep_2.11:%s" % FLINK_VERSION,
] + IO_GRPC_GRPC_JAVA_ARTIFACTS,
generate_compat_repositories = True,
override_targets = IO_GRPC_GRPC_JAVA_OVERRIDE_TARGETS,
repositories = [
"https://jcenter.bintray.com/",
"https://maven.google.com",
"https://repo1.maven.org/maven2",
],
)
maven_install(
name = "testing",
artifacts = [
maven.artifact(
group = "org.apache.flink",
artifact = "flink-runtime_2.11",
version = FLINK_VERSION,
classifier = "tests",
packaging = "test-jar",
),
maven.artifact(
group = "org.apache.flink",
artifact = "flink-streaming-java_2.11",
version = FLINK_VERSION,
classifier = "tests",
packaging = "test-jar",
),
maven.artifact(
group = "org.apache.flink",
artifact = "flink-test-utils-junit",
version = FLINK_VERSION,
),
"org.apache.flink:flink-tests:%s" % FLINK_VERSION,
],
repositories = [
"https://jcenter.bintray.com/",
"https://maven.google.com",
"https://repo1.maven.org/maven2",
],
)
load("@maven//:compat.bzl", "compat_repositories")
compat_repositories()
...
The "testing" maven_install group is my attempt to figure out how to get the required import dependencies, because as far as I can tell there is no explicit "type" and "scope" in Bazel's Maven syntax.
The test BUILD file looks like this:
load("@rules_java//java:defs.bzl", "java_test")
load("@rules_jvm_external//:specs.bzl", "maven")
load("@rules_jvm_external//:defs.bzl", "artifact", "maven_install")
java_test(
name = "some_module",
size = "medium",
srcs = ["DataOperatorTests.java"],
tags = [
"docker",
"integration",
],
test_class = "org.some.project.DataOperatorTests",
deps = [
"@maven//:com_google_protobuf_protobuf_java",
"@maven//:junit_junit_4_13",
"@maven//:org_apache_flink_flink_connector_kafka_0_11_2_11",
"@maven//:org_apache_flink_flink_core",
"@maven//:org_apache_flink_flink_java",
"@maven//:org_javatuples_javatuples",
"@maven//:org_mongodb_bson",
"@maven//:org_mongodb_mongodb_driver_core",
"@maven//:org_mongodb_mongodb_driver_reactivestreams",
"@maven//:org_testcontainers_kafka",
"@maven//:org_testcontainers_testcontainers",
"@maven//:org_apache_flink_flink_streaming_java_2_11",
"@testing//:org_apache_flink_flink_runtime_2_11_tests",
"@testing//:org_apache_flink_flink_streaming_java_2_11_tests",
"@testing//:org_apache_flink_flink_test_utils_junit",
"@testing//:org_apache_flink_flink_tests",
],
)
test_suite(name = "smoke",tags = ["-docker","-internal",],)
test_suite(name = "unit",tags = ["-internal","unit",],)
test_suite(name = "integration",tags = ["-internal","integration",],)
test_suite(name = "internal")使用上述设置,将不再发生以下错误--将其留作参考,
The package seems to exist, but in the actual test BUILD file I get this error during syncing:
Error:(5, 1) no such target '@testing//:org_apache_flink_flink_streaming_java_2_11': target 'org_apache_flink_flink_streaming_java_2_11' not declared in package '' defined by /home/user/.cache/bazel/_bazel_user/7b62e4e31c70ee640c6d33972433da28/external/testing/BUILD and referenced by '//core/src/test/java/org/some/project/some_module:some_module'
Posted on 2020-05-11 10:14:56
I solved this myself and adjusted the code above so that it is now a working setup.
To summarize:
When the Maven dependencies are given like this:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-test-utils-junit</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-runtime_2.11</artifactId>
<version>${flink.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
the corresponding Bazel WORKSPACE file must contain entries like the following:
load("@rules_jvm_external//:defs.bzl", "maven_install")
load("@rules_jvm_external//:defs.bzl", "artifact")
load("@rules_jvm_external//:specs.bzl", "maven")
maven_install(
...
artifacts = [
maven.artifact(
group = "org.apache.flink",
artifact = "flink-runtime_2.11",
version = FLINK_VERSION,
classifier = "tests",
packaging = "test-jar",
),
maven.artifact(
group = "org.apache.flink",
artifact = "flink-streaming-java_2.11",
version = FLINK_VERSION,
classifier = "tests",
packaging = "test-jar",
),
maven.artifact(
group = "org.apache.flink",
artifact = "flink-test-utils-junit",
version = FLINK_VERSION,
),
"org.apache.flink:flink-tests:%s" % FLINK_VERSION,
],
repositories = [
"https://jcenter.bintray.com/",
"https://maven.google.com",
"https://repo1.maven.org/maven2",
],
)
A detailed explanation of the dependency specification can be found here: https://github.com/bazelbuild/rules_jvm_external#detailed-dependency-information-specifications and here: https://github.com/bazelbuild/rules_jvm_external/blob/master/docs/api.md#mavenartifact
To find the resulting target labels to import in a BUILD file, use the query mentioned by @Jin:
bazel query @maven//:all | grep flink_flink_streaming
In most cases the target will simply carry the suffix _tests.
Here are the necessary deps for the BUILD file:
load("@rules_jvm_external//:specs.bzl", "maven")
load("@rules_jvm_external//:defs.bzl", "artifact", "maven_install")
java_test(
name = "some_module",
size = "medium",
srcs = ["DataOperatorTests.java"],
tags = [
"docker",
"integration",
],
test_class = "org.some.project.DataOperatorTests",
deps = [
...
"@maven//:org_apache_flink_flink_runtime_2_11_tests",
"@maven//:org_apache_flink_flink_streaming_java_2_11_tests",
"@maven//:org_apache_flink_flink_test_utils_junit",
"@maven//:org_apache_flink_flink_tests",
],
)
Posted on 2022-09-29 23:37:40
The harness files are part of the "flink-streaming-java_2.12" artifact with type test-jar. So if your project code uses both the test library and the source library, it is best to use a classifier for your test dependency.
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>${flink.version}</version>
<scope>test</scope>
<classifier>tests</classifier>
<type>test-jar</type>
</dependency>
Check the external libraries in IntelliJ to see that there are now two jars, one for the sources and one for the tests.
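As a quick sanity check, here is a compile-only sketch (the class name is arbitrary) that references one class from the regular flink-streaming-java jar and one from its tests classifier jar; if it compiles in your test target, both jars are on the test classpath.
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;

public class ClasspathCheck {
    // DataStream comes from the source jar, the harness from the tests classifier jar.
    static final Class<?>[] PROBES = {DataStream.class, OneInputStreamOperatorTestHarness.class};
}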
https://stackoverflow.com/questions/61697751