This article collects code examples of the reactor.core.publisher.Mono.mergeWith() method in Java, showing how Mono.mergeWith() is used in practice. The examples are extracted from selected open-source projects on platforms such as GitHub, Stack Overflow, and Maven, so they make useful references. Details of the Mono.mergeWith() method are as follows:
Package: reactor.core.publisher
Class: Mono
Method: mergeWith
Javadoc: Merge emissions of this Mono with the provided Publisher. The element from the Mono may be interleaved with the elements of the Publisher.
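As a minimal, self-contained sketch (class and variable names are chosen for this illustration), merging a Mono with a Flux produces a Flux that emits all of their elements. With purely synchronous sources as here, the Mono's element happens to arrive first, matching the reactor-core test shown below:

```java
import java.util.List;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class MergeWithDemo {
    public static void main(String[] args) {
        // mergeWith widens the Mono into a Flux that also emits the other Publisher's elements
        Flux<String> merged = Mono.just("a").mergeWith(Flux.just("b", "c"));

        // Collect the merged emissions; with synchronous sources there is no interleaving here
        List<String> out = merged.collectList().block();
        System.out.println(out); // [a, b, c]
    }
}
```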
Code example source: origin: reactor/reactor-core
@Test
public void normal2() {
    StepVerifier.create(Mono.just(1).mergeWith(Flux.just(2, 3)))
                .expectNext(1, 2, 3)
                .verifyComplete();
}
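To actually observe the interleaving mentioned in the Javadoc, the Mono's element can be delayed so the Flux's elements overtake it. The sketch below (class name chosen for this illustration) also contrasts mergeWith with concatWith, which waits for the first source to complete and therefore preserves source order:

```java
import java.time.Duration;
import java.util.List;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class MergeInterleavingDemo {
    public static void main(String[] args) {
        // mergeWith subscribes to both sources eagerly: the delayed Mono element arrives last
        List<Integer> merged = Mono.just(1)
                .delayElement(Duration.ofMillis(50))
                .mergeWith(Flux.just(2, 3))
                .collectList()
                .block();
        System.out.println(merged); // [2, 3, 1]

        // concatWith waits for the Mono to complete before subscribing to the Flux
        List<Integer> concatenated = Mono.just(1)
                .delayElement(Duration.ofMillis(50))
                .concatWith(Flux.just(2, 3))
                .collectList()
                .block();
        System.out.println(concatenated); // [1, 2, 3]
    }
}
```

Choose mergeWith when you only care that all elements arrive, and concatWith when relative ordering of the sources matters.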
Code example source: origin: com.aol.cyclops/cyclops-reactor
/**
 * @param other the Publisher to merge with the boxed Mono
 * @return a Flux merging the emissions of the boxed Mono and the other Publisher
 * @see reactor.core.publisher.Mono#mergeWith(org.reactivestreams.Publisher)
 */
public final Flux<T> mergeWith(Publisher<? extends T> other) {
    return boxed.mergeWith(other);
}
Code example source: origin: g00glen00b/spring-samples
/**
 * Follow-up web crawler command.
 * @param result the crawl result to follow up on
 * @param maxDepth the maximum crawl depth
 * @return a Flux emitting this result merged with the results of the next crawl commands
 */
public Flux<CrawlerResult> crawl(CrawlerResult result, int maxDepth) {
    return Mono.just(result).mergeWith(crawl(getNextCommands(result), maxDepth));
}
Code example source: origin: rayokota/kafka-graphs
public Mono<ServerResponse> run(ServerRequest request) {
    List<String> appIdHeaders = request.headers().header(X_KGRAPH_APPID);
    String appId = request.pathVariable("id");
    return request.bodyToMono(GraphAlgorithmRunRequest.class)
        .flatMapMany(input -> {
            log.debug("num iterations: {}", input.getNumIterations());
            PregelGraphAlgorithm<Long, ?, ?, ?> algorithm = algorithms.get(appId);
            GraphAlgorithmState state = algorithm.run(input.getNumIterations());
            GraphAlgorithmStatus status = new GraphAlgorithmStatus(state);
            Flux<GraphAlgorithmStatus> states =
                proxyRun(appIdHeaders.isEmpty() ? group.getCurrentMembers().keySet() : Collections.emptySet(), appId, input);
            return Mono.just(status).mergeWith(states);
        })
        .reduce((state1, state2) -> state1)
        .flatMap(state ->
            ServerResponse.ok()
                .contentType(MediaType.APPLICATION_JSON)
                .body(Mono.just(state), GraphAlgorithmStatus.class)
        );
}
This content was collected from the internet; if it infringes your rights, please contact the author for removal.