ReadFromKafka throws ValueError: Unsupported signal: 2

jslywgbw · posted 2021-07-12 in Spark

Currently I am trying to get the hang of Apache Beam together with Apache Kafka.
The Kafka service is running (locally), and I wrote a few test messages with the kafka-console-producer.
First I wrote this Java snippet to test Apache Beam in a language I am familiar with. Everything worked as expected.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.io.kafka.KafkaIO.Read;
import org.apache.beam.sdk.io.kafka.KafkaRecord;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.kafka.common.serialization.LongDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class Main {

  public static void main(String[] args) {

    Pipeline pipeline = Pipeline.create();

    Read<Long, String> kafkaReader = KafkaIO.<Long, String>read()
        .withBootstrapServers("localhost:9092")
        .withTopic("beam-test")
        .withKeyDeserializer(LongDeserializer.class)
        .withValueDeserializer(StringDeserializer.class);

    // Note: withoutMetadata() returns a new transform; its result is
    // discarded here, so the pipeline below still reads KafkaRecords.
    kafkaReader.withoutMetadata();

    pipeline
        .apply("Kafka", kafkaReader)
        .apply("Extract words", ParDo.of(new DoFn<KafkaRecord<Long, String>, String>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            System.out.println("Key: " + c.element().getKV().getKey()
                + " | Value: " + c.element().getKV().getValue());
          }
        }));

    pipeline.run();
  }
}

My goal is to write the same thing in Python, and this is where I currently am:

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions


def run_pipe():

    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
        | 'Kafka Unbounded' >> ReadFromKafka(consumer_config={'bootstrap.servers': 'localhost:9092'}, topics=['beam-test'])
        | 'Test Print' >> beam.Map(print)
        )


if __name__ == '__main__':
    run_pipe()

Now to the problem. When I try to run the Python code, I get the following error:

(app) λ python ArghKafkaExample.py 
Traceback (most recent call last):
  File "ArghKafkaExample.py", line 22, in <module>
    run_pipe()
  File "ArghKafkaExample.py", line 10, in run_pipe
    (p
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\transforms\ptransform.py", line 1028, in __ror__
    return self.transform.__ror__(pvalueish, self.label)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\transforms\ptransform.py", line 572, in __ror__
    result = p.apply(self, pvalueish, label)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\pipeline.py", line 648, in apply
    return self.apply(transform, pvalueish)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\pipeline.py", line 691, in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\runners\runner.py", line 198, in apply
    return m(transform, input, options)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\runners\runner.py", line 228, in apply_PTransform
    return transform.expand(input)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\transforms\external.py", line 322, in expand
    self._expanded_components = self._resolve_artifacts(
  File "C:\Users\gamef\AppData\Local\Programs\Python\Python38\lib\contextlib.py", line 120, in __exit__
    next(self.gen)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\transforms\external.py", line 372, in _service
    yield stub
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\transforms\external.py", line 523, in __exit__
    self._service_provider.__exit__(*args)
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\utils\subprocess_server.py", line 74, in __exit__
    self.stop()
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\utils\subprocess_server.py", line 133, in stop
    self.stop_process()
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\utils\subprocess_server.py", line 179, in stop_process
    return super(JavaJarServer, self).stop_process()
  File "C:\Users\gamef\git\BeamMeScotty\app\lib\site-packages\apache_beam\utils\subprocess_server.py", line 143, in stop_process
    self._process.send_signal(signal.SIGINT)
  File "C:\Users\gamef\AppData\Local\Programs\Python\Python38\lib\subprocess.py", line 1434, in send_signal
    raise ValueError("Unsupported signal: {}".format(sig))
ValueError: Unsupported signal: 2

From googling I found out that it has something to do with program exit codes (like Ctrl+C), but overall I have absolutely no idea what the problem is.
Any advice would help!
Greetings, Pascal


exdqitrt1#

Your pipeline code seems correct here. The problem is a requirement of Kafka IO in the Python SDK. From the module documentation:

"These transforms are currently supported by Beam portable runners (for example, portable Flink and Spark) as well as the Dataflow runner."

"The transforms in this module are cross-language transforms implemented in the Beam Java SDK. During pipeline construction, the Python SDK will connect to a Java expansion service to expand these transforms. To facilitate this, a small amount of setup is needed before using these transforms in a Beam Python pipeline."

Kafka IO is implemented in Python as a cross-language transform in Java, and your pipeline is failing because you have not set up your environment to execute cross-language transforms. To explain what a cross-language transform is in layman's terms: it means that the Kafka transform actually executes on the Java SDK rather than the Python SDK, so it can make use of the existing Kafka code on Java.
There are two hurdles preventing your pipeline from working. The easier one to solve is that only the runners I quoted above support cross-language transforms, so if you are running this pipeline with the Direct runner it will not work; you need to switch to the Flink or Spark runner in local mode.
The trickier hurdle is that you need to start an expansion service to be able to add external transforms to your pipeline. The stacktrace you are getting occurs because Beam attempts to expand the transform but cannot connect to the expansion service, and the expansion fails.
If you still want to try running this with cross-language despite the extra setup, the documentation I linked contains instructions for running an expansion service. At the time I am writing this answer the feature is still new, and there may be blind spots in the documentation. If you run into problems, I encourage you to ask questions on the Apache Beam users mailing list or in the Apache Beam Slack channel.
