Scala — viewing the full response of a Gatling bidiStream

mpgws1up · posted 6 months ago · in Scala

Hope you're doing well. I'm currently looking at bidiStream: I want to see the bidiStream's responses, but it doesn't return anything to me, and I can't tell whether the stream is even running. The script is below:

val paymentFinalizerProcess_Message = (session: io.gatling.core.session.Session) => { 
    val PaymentCustomize = Payment(
        //body
    )
    val paymentResultCustomize = PaymentResult(
        //body
    )
    PaymentFinalizerRequest(
        //body
    )
  }
  val PaymentFinalizerProcess_Call = grpc("BIDIStream")
    .bidiStream[PaymentFinalizerRequest, PaymentFinalizerHandlingResponse](RelayServerGrpc.METHOD_PAYMENT_FINALIZER_PROCESS, "PaymentFinalizerProcess")

  val speaker = scenario("PaymentFinalizerProcess")
  .exec(
    PaymentFinalizerProcess_Call
      .connect
      .header(Authorization)(s"Bearer string")
      .callOptions(CallOptions.DEFAULT.withDeadlineAfter(30, TimeUnit.SECONDS))
      .endCheck(statusCode is Status.Code.DEADLINE_EXCEEDED)
  )
  .exec(
    PaymentFinalizerProcess_Call
      .send(paymentFinalizerProcess_Message)
  )
  .exec(PaymentFinalizerProcess_Call.reconciliate(waitFor = StreamEnd))
  setUp(
    speaker.inject(
        atOnceUsers(1)
      )
  ).protocols(grpcConf)
}
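
For what it's worth, one reading of the script above: the 30-second deadline combined with `endCheck(statusCode is Status.Code.DEADLINE_EXCEEDED)` means the scenario is designed to sit on the open stream until the deadline cancels it, so "no response" is reported as OK. Below is a hedged sketch of a variant that half-closes the stream and saves incoming messages into the session instead. The `extract`, `sessionCombiner`, and `complete` calls are assumptions based on the phiSgr gatling-grpc streaming API (the `c.g.p.g.g.s.BidiStreamCall` logger in the output suggests that plugin); verify them against your plugin version before relying on this.

```scala
// Sketch only: reuses the request/response types and message builder from
// the script above; extract/sessionCombiner/complete are assumed from the
// gatling-grpc streaming API and should be checked against your version.
val call = grpc("BIDIStream")
  .bidiStream[PaymentFinalizerRequest, PaymentFinalizerHandlingResponse](
    RelayServerGrpc.METHOD_PAYMENT_FINALIZER_PROCESS,
    "PaymentFinalizerProcess"
  )

val speaker = scenario("PaymentFinalizerProcess")
  .exec(
    call.connect
      .header(Authorization)(s"Bearer string")
      // Save each server message into the session so it is visible in
      // session logging and can be asserted on.
      .extract(r => Some(r))(_ saveAs "lastResponse")
      .sessionCombiner(SessionCombiner.pick("lastResponse"))
      // Expect a clean close instead of waiting for the deadline to fire.
      .endCheck(statusCode is Status.Code.OK)
  )
  .exec(call.send(paymentFinalizerProcess_Message))
  // Half-close the client side; many bidi servers only reply or close
  // the stream after the client signals it is done sending.
  .exec(call.complete)
  .exec(call.reconciliate(waitFor = StreamEnd))
```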

I thought the log level might be too low, so I set the root level in logback.xml to WARN, but nothing changed; I'd appreciate a little help. Additional info: I have since set the level to TRACE and tried again, and the logs still don't seem to show the response stream. Here are the simulation.log records:

RUN test.scala.PaymentFinalizerProcess  paymentfinalizerprocess 1698283070288       3.9.0
USER    PaymentFinalizerProcess START   1698283071010
REQUEST     BIDIStream  1698283070849   1698283100857   OK   
USER    PaymentFinalizerProcess END 1698283100874


Gatling output:

10:17:47.826 [TRACE] i.g.a.Gatling$ - Starting
10:17:47.832 [INFO ] i.g.c.c.GatlingConfiguration$ - Gatling will try to load 'gatling.conf' config file as ClassLoader resource.
10:17:47.984 [TRACE] i.g.a.Gatling$ - Configuration loaded
10:17:47.984 [TRACE] i.g.a.Gatling$ - ActorSystem instantiated
10:17:48.279 [INFO ] a.e.s.Slf4jLogger - Slf4jLogger started
10:17:48.511 [DEBUG] i.n.u.i.l.InternalLoggerFactory - Using SLF4J as the default logging framework
10:17:48.628 [DEBUG] i.n.u.i.PlatformDependent0 - -Dio.netty.noUnsafe: false
10:17:48.629 [DEBUG] i.n.u.i.PlatformDependent0 - Java version: 17
10:17:48.629 [DEBUG] i.n.u.i.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
10:17:48.630 [DEBUG] i.n.u.i.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
10:17:48.630 [DEBUG] i.n.u.i.PlatformDependent0 - sun.misc.Unsafe.storeFence: available
10:17:48.631 [DEBUG] i.n.u.i.PlatformDependent0 - java.nio.Buffer.address: available
10:17:48.633 [DEBUG] i.n.u.i.PlatformDependent0 - direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
        at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
        at io.netty.util.internal.PlatformDependent0$5.run(PlatformDependent0.java:288)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:318)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:282)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:333)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:88)
        at io.netty.util.internal.ClassInitializerUtil.tryLoadClasses(ClassInitializerUtil.java:32)
        at io.netty.channel.epoll.Native.<clinit>(Native.java:72)
        at io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:40)
        at io.gatling.netty.util.Transports.newEventLoopGroup(Transports.java:47)
        at io.gatling.app.Gatling$.start(Gatling.scala:85)
        at io.gatling.app.Gatling$.fromArgs(Gatling.scala:51)
        at io.gatling.app.Gatling$.fromSbtTestFramework(Gatling.scala:47)
        at io.gatling.sbt.GatlingTask.liftedTree1$1(GatlingTask.scala:51)
        at io.gatling.sbt.GatlingTask.execute(GatlingTask.scala:50)
        at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:833)
10:17:48.634 [DEBUG] i.n.u.i.PlatformDependent0 - java.nio.Bits.unaligned: available, true
10:17:48.635 [DEBUG] i.n.u.i.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$7 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @ba9adf9
        at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:392)
        at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:674)
        at java.base/java.lang.reflect.Method.invoke(Method.java:560)
        at io.netty.util.internal.PlatformDependent0$7.run(PlatformDependent0.java:410)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:318)
        at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:401)
        at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:333)
        at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:88)
        at io.netty.util.internal.ClassInitializerUtil.tryLoadClasses(ClassInitializerUtil.java:32)
        at io.netty.channel.epoll.Native.<clinit>(Native.java:72)
        at io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:40)
        at io.gatling.netty.util.Transports.newEventLoopGroup(Transports.java:47)
        at io.gatling.app.Gatling$.start(Gatling.scala:85)
        at io.gatling.app.Gatling$.fromArgs(Gatling.scala:51)
        at io.gatling.app.Gatling$.fromSbtTestFramework(Gatling.scala:47)
        at io.gatling.sbt.GatlingTask.liftedTree1$1(GatlingTask.scala:51)
        at io.gatling.sbt.GatlingTask.execute(GatlingTask.scala:50)
        at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
        at java.base/java.lang.Thread.run(Thread.java:833)
10:17:48.635 [DEBUG] i.n.u.i.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable
10:17:48.636 [DEBUG] i.n.u.i.PlatformDependent - sun.misc.Unsafe: available
10:17:48.649 [DEBUG] i.n.u.i.PlatformDependent - maxDirectMemory: 1073741824 bytes (maybe)
10:17:48.650 [DEBUG] i.n.u.i.PlatformDependent - -Dio.netty.tmpdir: C:\Users\TS-NGU~1.TRI\AppData\Local\Temp (java.io.tmpdir)
10:17:48.650 [DEBUG] i.n.u.i.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
10:17:48.650 [DEBUG] i.n.u.i.PlatformDependent - Platform: Windows
10:17:48.651 [DEBUG] i.n.u.i.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes
10:17:48.651 [DEBUG] i.n.u.i.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1
10:17:48.652 [DEBUG] i.n.u.i.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available
10:17:48.652 [DEBUG] i.n.u.i.PlatformDependent - -Dio.netty.noPreferDirect: false
10:17:48.659 [DEBUG] i.n.c.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
10:17:48.677 [DEBUG] i.n.u.i.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
10:17:48.677 [DEBUG] i.n.u.i.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
10:17:48.682 [DEBUG] i.n.c.n.NioEventLoop - -Dio.netty.noKeySetOptimization: false
10:17:48.682 [DEBUG] i.n.c.n.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
10:17:48.690 [DEBUG] i.n.u.i.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
10:17:48.695 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@47ddfd7
10:17:48.698 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@3e57e06e
10:17:48.700 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@67996c16
10:17:48.702 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@18dc2a71
10:17:48.708 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@42455023
10:17:48.711 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@43908c95
10:17:48.714 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@6f45a12d
10:17:48.716 [TRACE] i.n.c.n.NioEventLoop - instrumented a special java.util.Set into: sun.nio.ch.WEPollSelectorImpl@16338116
10:17:48.720 [TRACE] i.g.a.Gatling$ - Runner instantiated
10:17:48.720 [TRACE] i.g.app.Runner - Running
Gatling 3.9.5 is available! (you're using 3.9.0)
10:17:50.287 [TRACE] i.g.app.Runner - Simulation params built
10:17:50.288 [TRACE] i.g.app.Runner - Before hook executed
10:17:50.309 [TRACE] i.g.app.Runner - CoreComponents instantiated
10:17:50.341 [DEBUG] c.g.p.g.g.p.GrpcProtocol$ - Making warm up call with method grpc.health.v1.Health/Check
10:17:50.494 [DEBUG] i.n.u.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple
10:17:50.495 [DEBUG] i.n.u.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4
10:17:50.499 [DEBUG] i.n.b.AbstractByteBuf - -Dio.netty.buffer.checkAccessible: true
10:17:50.500 [DEBUG] i.n.b.AbstractByteBuf - -Dio.netty.buffer.checkBounds: true
10:17:50.501 [DEBUG] i.n.u.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@7a8e6023
10:17:50.554 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 8
10:17:50.554 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 8
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 9
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 4194304
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimIntervalMillis: 0
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: false
10:17:50.555 [DEBUG] i.n.b.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
10:17:50.575 [DEBUG] i.n.c.DefaultChannelId - -Dio.netty.processId: 2940 (auto-detected)
10:17:50.577 [DEBUG] i.n.u.NetUtil - -Djava.net.preferIPv4Stack: false
10:17:50.578 [DEBUG] i.n.u.NetUtil - -Djava.net.preferIPv6Addresses: false
10:17:50.586 [DEBUG] i.n.u.NetUtilInitializations - Loopback interface: lo (Software Loopback Interface 1, 127.0.0.1)
10:17:50.587 [DEBUG] i.n.u.NetUtil - Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
10:17:50.605 [DEBUG] i.n.c.DefaultChannelId - -Dio.netty.machineId: 00:05:9a:ff:fe:3c:7a:00 (auto-detected)
10:17:50.627 [DEBUG] i.n.b.ByteBufUtil - -Dio.netty.allocator.type: pooled
10:17:50.628 [DEBUG] i.n.b.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0
10:17:50.628 [DEBUG] i.n.b.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384
10:17:50.642 [DEBUG] i.n.u.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096
10:17:50.642 [DEBUG] i.n.u.Recycler - -Dio.netty.recycler.ratio: 8
10:17:50.642 [DEBUG] i.n.u.Recycler - -Dio.netty.recycler.chunkSize: 32
10:17:50.642 [DEBUG] i.n.u.Recycler - -Dio.netty.recycler.blocking: false
10:17:50.675 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url] OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
10:17:50.682 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
10:17:50.696 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND SETTINGS: ack=false settings=[id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url
10:17:50.697 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND SETTINGS: ack=true
10:17:50.714 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND HEADERS: streamId=3 headers=GrpcHttp2OutboundHeaders[:authority: url, :path: /grpc.health.v1.Health/Check, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.46.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
10:17:50.722 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND DATA: streamId=3 padding=0 endStream=true length=5 bytes=0000000000
10:17:50.724 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND SETTINGS: ack=false settings={HEADER_TABLE_SIZE=4096, MAX_CONCURRENT_STREAMS=100, MAX_FRAME_SIZE=16384, MAX_HEADER_LIST_SIZE=65535}
10:17:50.724 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND SETTINGS: ack=true
10:17:50.725 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
10:17:50.726 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND SETTINGS: ack=true
10:17:50.773 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND HEADERS: streamId=3 headers=GrpcHttp2ResponseHeaders[:status: 200, content-type: application/grpc, x-unique-req-id: 6456D0E0_AA6B645628F31F906539BE3E1C1D1] padding=0 endStream=false
10:17:50.778 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND DATA: streamId=3 padding=0 endStream=false length=7 bytes=00000000020801
10:17:50.778 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND PING: ack=false bytes=1234
10:17:50.782 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url INBOUND HEADERS: streamId=3 headers=GrpcHttp2ResponseHeaders[grpc-status: 0, grpc-message: ] padding=0 endStream=true
10:17:50.783 [DEBUG] c.g.p.g.g.p.GrpcProtocol$ - Warm up request successful
10:17:50.786 [DEBUG] i.g.n.NettyClientHandler - [id: 0x18efc0a2, L:/10.80.74.193:51432 - R:url OUTBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=0 bytes=
Simulation test.scala.PaymentFinalizerProcess started...
10:17:50.792 [TRACE] i.g.app.Runner - Asking Controller to start
10:17:50.804 [INFO ] i.g.c.s.w.ConsoleDataWriter - Initializing
10:17:50.804 [INFO ] i.g.c.s.w.LogFileDataWriter - Initializing
10:17:50.819 [INFO ] i.g.c.s.w.ConsoleDataWriter - Initialized
10:17:50.829 [INFO ] i.g.c.s.w.LogFileDataWriter - Initialized
10:17:50.842 [TRACE] i.g.c.c.i.o.OpenWorkload - Injecting 1 users in scenario PaymentFinalizerProcess, continue=false
10:17:50.842 [INFO ] i.g.c.c.i.o.OpenWorkload - Scenario io.gatling.core.scenario.Scenario@46ef27ae has finished injecting
10:17:50.843 [INFO ] i.g.c.c.i.Injector - All scenarios have finished injecting
10:17:50.850 [DEBUG] c.g.p.g.g.s.BidiStreamCall - Opening stream 'PaymentFinalizerProcess': Scenario 'PaymentFinalizerProcess', UserId #1
10:17:50.873 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
10:17:50.874 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
10:17:50.883 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url INBOUND SETTINGS: ack=false settings=[id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url
10:17:50.884 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND SETTINGS: ack=true
10:17:50.885 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND HEADERS: streamId=3 headers=GrpcHttp2OutboundHeaders[:authority: url, :path: /relay.RelayServer/PaymentFinalizerProcess, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.46.0, authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.monLtqskKtLCuVNBz1Pw-oLPM3sWpC34E31HNa4tWlY, grpc-accept-encoding: gzip, grpc-timeout: 29963072u] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
10:17:50.896 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url INBOUND SETTINGS: ack=false settings={HEADER_TABLE_SIZE=4096, MAX_CONCURRENT_STREAMS=100, MAX_FRAME_SIZE=16384, MAX_HEADER_LIST_SIZE=65535}
10:17:50.897 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND SETTINGS: ack=true
10:17:50.897 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
10:17:50.897 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url INBOUND SETTINGS: ack=true
10:17:51.002 [DEBUG] c.g.p.g.g.s.BidiStreamCall - Sending message device {
  id: "123dev123"
}
payment {
  relayPaymentId: "thisId"
  amount: 6000.0
  posName: "posName"
  posVersion: "posVersion"
  additionalInfo: "sunt fugiat ut commodo qui"
}
paymentResult {
  status: "test"
  paymentDetailString: "ut labore aliquip exercitation velit"
}
command: "PAYMENT_RESULT"
log: "abc"
 with stream 'PaymentFinalizerProcess': Scenario 'PaymentFinalizerProcess', UserId #1
10:17:51.009 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url] OUTBOUND DATA: streamId=3 padding=0 endStream=false length=185 bytes=00000000b40a0b0a093132336465763132331a600a2435393934343330362d626431362d346433632d616461622d663761643035306162383839110000000000...
10:17:51.010 [DEBUG] i.g.c.c.i.o.OpenWorkload - Start user #1
10:18:20.859 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:url OUTBOUND RST_STREAM: streamId=3 errorCode=8
10:18:20.870 [TRACE] c.g.p.g.g.s.BidiStreamCall -
>>>>>>>>>>>>>>>>>>>>>>>>>>
Stream Close:
BIDIStream - PaymentFinalizerProcess: OK
=========================
Session:
Session(PaymentFinalizerProcess,1,Map(gatling.grpc.channel -> ManagedChannelOrphanWrapper{delegate=ManagedChannelImpl{logId=5, target=url}}),OK,List(),io.gatling.core.protocol.ProtocolComponentsRegistry$$Lambda$489/0x00000135584eb710@5975dcbd,io.netty.channel.nio.NioEventLoop@447ebb27)
=========================
gRPC stream completion:
status=
DEADLINE_EXCEEDED, description: deadline exceeded after 29.995212400s. [closed=[], open=[[buffered_nanos=32121500, remote_addr=url]]]
<<<<<<<<<<<<<<<<<<<<<<<<<
10:18:20.870 [DEBUG] i.g.c.a.Exit - End user #1
10:18:20.871 [DEBUG] i.g.n.NettyClientHandler - [id: 0x3af7fdc3, L:/10.80.74.193:51433 - R:[url] OUTBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=0 bytes=
10:18:20.875 [DEBUG] i.g.c.c.i.Injector - End user #PaymentFinalizerProcess
10:18:20.875 [INFO ] i.g.c.c.i.Injector - All users of scenario PaymentFinalizerProcess are stopped
10:18:20.876 [INFO ] i.g.c.c.i.Injector - Stopping
10:18:20.876 [INFO ] i.g.c.c.Controller - Injector has stopped, initiating graceful stop


My logback.xml is in src/test/resources, with the following content:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>

    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
        </encoder>
        <immediateFlush>false</immediateFlush>
    </appender>

    <!-- uncomment and set to DEBUG to log all failing HTTP requests -->
    <!-- uncomment and set to TRACE to log all HTTP requests -->
    <!--<logger name="io.gatling.http.engine.response" level="TRACE" />-->

    <!-- uncomment to log WebSocket events -->
    <logger name="io.gatling.http.action.ws.fsm" level="TRACE" />

    <!-- uncomment to log SSE events -->
    <logger name="io.gatling.http.action.sse.fsm" level="TRACE" />

    <root level="TRACE">
        <appender-ref ref="CONSOLE" />
    </root>

</configuration>

m528fe3b · answer 1

Logback has five log levels, in increasing order of importance: TRACE, DEBUG, INFO, WARN, ERROR.

"I thought the log level might be too low, so I set the root level in logback.xml to WARN, but nothing changed."

You have it backwards: to see all of the stream's messages in the log, you should use the lowest level, TRACE, not a higher one like WARN.
As an aside, have you tried ChatGPT? Give it the sentence quoted above and you will see the misunderstanding.
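
Note that with the root logger at TRACE, the stream messages drown in Netty DEBUG output like the log above. A narrower alternative is to keep the root at WARN and enable TRACE only for the plugin's own loggers. This is a sketch: the package name `com.github.phisgr.gatling.grpc` is inferred from the abbreviated logger `c.g.p.g.g.s.BidiStreamCall` visible in the output above, so verify it against your plugin version.

```xml
<!-- Keep Netty/Gatling internals quiet... -->
<root level="WARN">
    <appender-ref ref="CONSOLE" />
</root>

<!-- ...but log every message sent and received on gRPC streams
     (package name inferred from the c.g.p.g.g.s.BidiStreamCall logger). -->
<logger name="com.github.phisgr.gatling.grpc" level="TRACE" />
```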
