
Extra memory consumption in ramping-arrival-rate if big map is used #3734

Closed

mkosta opened this issue May 8, 2024 · 1 comment

@mkosta commented May 8, 2024

Brief summary

Noticed with the ramping-arrival-rate executor that if a big map is filled in the init stage and then used in the function, memory consumption jumps to about 10% per thread, compared to 1% if the map is filled inside the function. The numbers vary between scenarios and the ratio is not necessarily 1 to 10; in some cases it is worse. For example, doubling the pre-allocated VUs doubles the memory per thread.

Execute this stripped-down version of the script, then comment out the eventIdsMap fill in the init stage and uncomment it in the default function, and watch the difference in htop or whatever memory monitoring tool you use.

  let eventIdsMap = generateEventIds(100000);
  
  export const options = {
    scenarios: {
      contacts: {
        executor: 'ramping-arrival-rate',
  
        // Start `startRate` iterations per `timeUnit`
        startRate: 5000,

        // Time unit over which `startRate` iterations are started
        timeUnit: '10m',
  
        // Pre-allocate necessary VUs.
        preAllocatedVUs: 250,
  
        stages: [
          { target: 5000, duration: '10m' },
          { target: 6000, duration: '20m' },
          { target: 6000, duration: '40m' },
          { target: 60, duration: '20m' },
        ],
        exec: "default"
      },
    },
  };

  export default () => {
    //let eventIdsMap = generateEventIds(100000);

    let randomIntEventId = Array.from(eventIdsMap.keys())[
      randomIntBetween(0, eventIdsMap.size - 1)
    ]; //take random key

    console.log(randomIntEventId);
  };

  export function generateEventIds(maxEventIds) {
    let dataMap = new Map();
  
    for (let i = 1; i <= maxEventIds; i++) {
      dataMap.set(i, i); //key, eventId
    }
    return dataMap;
  }
  
  export function randomIntBetween(min, max) {
    return Math.floor(Math.random() * (max - min + 1) + min);
  }

[screenshot: memory usage with the map filled in the init stage]

[screenshot: memory usage with the map filled in the default function]

k6 version

0.50

OS

Ubuntu 22.04

Docker version and image (if applicable)

No response

Steps to reproduce the problem

Execute this stripped-down version of the provided script, then comment out the eventIdsMap fill in the init stage and uncomment it in the default function, and watch the difference in htop or whatever memory monitoring tool you use.

Expected behaviour

Memory should not be consumed at this level. In my understanding the init stage should share the map, and filling a 100k-entry map on every iteration should be worse than filling it once in the init stage.

Actual behaviour

For this particular executor there is massive memory usage when the map is around 100k entries, and the bigger the map the worse it gets. Other executors I tried, like shared-iterations and per-vu-iterations at a comparable RPS, consume little memory.

@mstoykov (Collaborator) commented

Hi @mkosta, sorry for the slow reply - I was off and the GitHub action doesn't take this into account 🤦.

Looking at your code, there is nothing ramping-arrival-rate specific about it.

You have a big allocation in the init context, and each VU executes the init code separately, as explained in Test lifecycle.

This means that with 250 VUs you end up with 250 copies of it in memory.
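
A minimal way to see this (a sketch only, assuming the script above; `__VU` is k6's built-in VU identifier and may be 0 during the initial option-parsing pass):

  // Top-level (init context) code: k6 re-runs it while initializing each VU.
  console.log(`init context executed for VU ${__VU}`);

  // So the 100k-entry map from the script above is allocated once per VU.
  let eventIdsMap = generateEventIds(100000);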

Your particular case lends itself pretty well to using a SharedArray, since you are basically using the map as an array.
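
A minimal sketch of that approach (illustrative only; SharedArray comes from the k6/data module and keeps a single read-only copy of the data shared across all VUs):

  import { SharedArray } from 'k6/data';

  // The construction callback runs only once; its result is shared read-only
  // across all VUs instead of being copied into each VU's init context.
  const eventIds = new SharedArray('event ids', function () {
    const ids = [];
    for (let i = 1; i <= 100000; i++) {
      ids.push(i); // plain numbers are JSON-serializable, as SharedArray requires
    }
    return ids;
  });

  export default function () {
    // Index access and .length work like on a normal array.
    const randomEventId = eventIds[Math.floor(Math.random() * eventIds.length)];
    console.log(randomEventId);
  }

SharedArray contents need to be JSON-serializable, which plain numeric ids are, so the random-pick pattern stays essentially the same.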

I don't know if this is your real example, but if not, please use the community forum for questions about using k6.

@mstoykov closed this as not planned on May 20, 2024