AppendToStreamAsync batched vs single event throughput disparity?

The code you posted measures the append time for the whole batch.

The following measures the average time per append call.
If you run it, you’ll see that the average append time for

- `batchsize=1 numevents=1000`
- `batchsize=1000 numevents=1000`

is quite different from what you might expect based on the test you’ve done so far.

It might be useful to explain a bit more what you are trying to achieve and how your system will work, so we can work out what type of benchmark you need to perform.

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;
    using EventStore.Client;
    using Newtonsoft.Json;

    // Benchmark sketch: the EventStoreClient `client`, the total event count `numevents`
    // and the batch size `batchsize` are supplied by the caller.
    async Task RunAppendBenchmark(EventStoreClient client, int numevents, int batchsize)
    {
        var numberOfAppends = 0;
        var eventbuffer = new List<EventData>(batchsize);
        var uuidlist = Enumerable.Range(0, numevents).Select(_ => Uuid.NewUuid()).ToList();

        var appendStopwatch = new Stopwatch();     // time spent inside AppendToStreamAsync only
        var totalStopwatch = Stopwatch.StartNew(); // total time, including serialization

        for (var i = 0; i < numevents; i++)
        {
            var eventBytes = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(new
            {
                Id = i,
                SomeProperty =
                    "somewhat long event property that babbles on in order to generate a somewhat typical event size for my application",
            })).AsMemory();

            eventbuffer.Add(new EventData(uuidlist[i], "TestEventType", eventBytes));

            // Flush the buffer once it reaches the batch size, timing only the append call.
            if (eventbuffer.Count == batchsize)
            {
                appendStopwatch.Start();
                await client.AppendToStreamAsync("TestStream", StreamState.Any, eventbuffer);
                appendStopwatch.Stop();

                numberOfAppends++;
                eventbuffer.Clear();
            }
        }
        totalStopwatch.Stop();

        Console.WriteLine($"Appends took {totalStopwatch.ElapsedMilliseconds}ms total, {totalStopwatch.ElapsedMilliseconds / (double)numevents}ms/event"); // 24,074ms
        Console.WriteLine($"Average append time: {appendStopwatch.ElapsedMilliseconds / (double)numberOfAppends}ms/append, time spent in appends = {appendStopwatch.ElapsedMilliseconds}ms");
    }
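
For completeness, here is a minimal sketch of how the benchmark above could be driven for the two configurations mentioned earlier. The connection string and the `RunAppendBenchmark` method name are illustrative assumptions, not part of your original code:

    // Hypothetical driver: the connection string and client setup are assumptions for illustration.
    var settings = EventStoreClientSettings.Create("esdb://localhost:2113?tls=false");
    await using var client = new EventStoreClient(settings);

    // batchsize=1: one AppendToStreamAsync round trip per event.
    await RunAppendBenchmark(client, numevents: 1000, batchsize: 1);

    // batchsize=1000: a single AppendToStreamAsync round trip for all 1000 events.
    await RunAppendBenchmark(client, numevents: 1000, batchsize: 1000);

Comparing the two runs should make it clearer how much of the per-event cost in the single-event case comes from paying the request round trip once per event rather than once per batch.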