Merge pull request #21 from artemoliynyk/misc/pre-prelease-fixes
Pre-release changes and fixes
artemoliynyk committed Apr 16, 2024
2 parents 530b70d + b3bc648 commit eec649b
Showing 5 changed files with 165 additions and 49 deletions.
2 changes: 1 addition & 1 deletion Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "oxiflow"
version = "1.0.0"
edition = "2021"
description = "Minimal HTTP loadtester with concurrency"
license = "GPL-3.0"
110 changes: 93 additions & 17 deletions README.md
@@ -1,34 +1,109 @@
# oxiflow
Small yet functional load testing tool written in Rust (oh yeah, "blazingly fast", of course)

# Quick start
Download the build for your favourite OS and run it from the command line.

**The following command will:** perform 4 concurrent requests and repeat the batch 3 times, wait 2 seconds between repeats (after every 4 requests), set the response timeout to 1 second, print a per-request summary after execution, and print trace-level debug info while working.
```shell
./oxiflow -c4 -r3 -d2 -t1 --per-request -vvvv https://site.test/url/path
```
---

**This command will:** read URLs from the file `urls.txt`, split them into batches of 5, and run each batch concurrently with a 1 second delay between batches (every 5 requests) and a response timeout of 2 seconds. It will show per-request information after execution and no extra log output, only a progress bar.
```shell
./oxiflow -c5 -d1 -t2 --per-request -f urls.txt
```

# Detailed usage explanation
`oxiflow` can perform up to 255 concurrent requests, with up to 255 repeats, a configurable response timeout, and a custom delay between batches.

The tester works both with a single URL and with a file listing URLs and methods.

The list of supported methods can be retrieved with the `--help-methods` argument.

## Single URL vs. File
There are two mutually exclusive arguments for providing test targets: a file or a URL.

### URL
If you have only one URL to call, you can provide just the URL (and, optionally, a method) and tweak the other parameters.

```shell
# to call a single URL with the default method (GET)
./oxiflow https://site.test/critical-endpoint

# to call a single URL with a specific HTTP method
./oxiflow https://site.test/post-endpoint -mPOST
```


### File
But if you have a set of different URLs, or you want to call the same URL with a few different HTTP methods, then a file is the right choice: `-f` or `--file`.

> To get a sample file, use the `--help-file` argument and the program will produce a dummy text file with all the supported methods and features

```text
# this is a sample file called url-list.txt
https://site.test/critical-endpoint
GET https://site.test/critical-endpoint
POST https://site.test/critical-endpoint
PUT https://site.test/critical-endpoint
```

The following command will call each URL defined in the file:
```shell
./oxiflow -f url-list.txt
```


Comments are supported in the file: start the line with a `#` character.

## Common arguments
_At any time - refer to the help for currently available options (`-h`)._

- **method** (`-m`) – which HTTP method to use when calling a single URL (GET, POST, etc.)
- **methods list** (`--help-methods`) – list currently supported methods
- **concurrent** (`-c`) - how many requests to send in parallel (might be system dependent, max. 255)
- **repeat** (`-r`) - how many times to repeat the defined batch of concurrent requests (max. 255)
- **timeout** (`-t`) - response timeout in _seconds_; if the server doesn't respond within this interval, the connection is terminated and the request is counted as failed
- **delay** (`-d`) - delay in seconds between repeated request batches.
  Concurrent requests are performed simultaneously, without delay. Consider disabling concurrency with `-c0` if you want a delay between each request
- **file** (`-f`) - use a text file's content as the URL list
- **reporting** (`--per-request`) - produces a per-URL report in the output
- **verbosity level** (`-v`) - prints more details during calls. This is an accumulator argument: the more `v`s you add, the more verbose the output, where `-v` is some verbosity and `-vvvv` is maximal (trace output)


## Concurrency and repeats
> **TL;DR:** the concurrency argument with a single URL multiplies that same URL C times, while with a file it splits the URL list into C-sized pieces

Concurrency works slightly differently with a single URL than with file-provided URLs, but the idea is the same: it forms the request batches.

After the batches are formed, each batch is called concurrently and the whole run is repeated according to the `-r` parameter.

Batching logic (concurrency):
- **With a single URL** - the tester _**forms a batch**_ of copies of that single URL and then repeats the batch N times
- **With multiple URLs in a file** - all the _**URLs are split**_ into batches and all the batches are repeated N times


Basically, this command will create a batch of 5 copies of the URL and call it in parallel twice, performing 10 requests in total (`5 * 2`):
```shell
oxiflow -c 5 -r 2 http://localhost:8080/test-url.html
```


## Sample test flow
The following command will perform 3 concurrent requests (simultaneously) to the defined site, repeat such a request batch 4 times with a response timeout of 2 seconds, and wait 2 seconds between batches.

This will result in `3 x 4 = 12` total request attempts. If the server fails to respond within 2 seconds, the connection is dropped and the attempt is recorded as an error.

```shell
oxiflow -t 2 -c 3 -r 4 -d 2 http://localhost:8080/test-url.html
```

With the current limitation, 255 concurrent requests can be repeated 255 times, which results in `255 * 255 = 65 025` requests.

There is no delay between the individual requests within a batch – be aware. Option `-c100`, for example, will instantly perform 100 requests; the `-d` delay applies only between batches.


# Roadmap
This is a pure fun pet project, but we all need some sort of plan, right?
@@ -45,6 +120,7 @@ Planned features:
- [x] PATCH
- [ ] Testing scenarios:
- [x] URLs file (with methods)
- [ ] Request content/body
- [ ] Pre-test actions (Auth)
- [ ] Reporting component
- [ ] Toggleable coloured output
69 changes: 50 additions & 19 deletions src/components/cli.rs
@@ -12,52 +12,64 @@ use crate::components::http;
#[command(author, version, about, long_about = None)]
/// Simple, fast, concurrent load tester with minimal reporting
pub struct Args {
/// single URL to call
#[arg(
conflicts_with("file"),
required_unless_present("file"),
required_unless_present("help_methods"),
required_unless_present("help_file"),
default_value = ""
)]
pub url: String,

/// HTTP method to use for calling the single URL
#[arg(long, short('m'), conflicts_with("file"), default_value = "GET")]
pub method: String,

/// text file with methods (optional) and URLs to call, line format: [METHOD] <URL>
#[arg(
long,
short('f'),
conflicts_with("url"),
required_unless_present("url"),
required_unless_present("help_methods"),
required_unless_present("help_file"),
default_value = ""
)]
pub file: String,


/// how many request to send concurrently
#[arg(long, short('c'), default_value_t = 1)]
pub concurrent: u8,

/// how many times to repeat the single URL call or all URLs in the file
#[arg(long, short('r'), default_value_t = 1)]
pub repeat: u8,

/// (seconds) request timeout, all requests lasting longer will be cancelled
#[arg(long, short('t'), default_value_t = 2)]
pub timeout: u8,

/// (seconds) delay between repeated requests or batches.
/// If concurrency is greater than 1, the delay occurs between batches, not individual requests
#[arg(long, short('d'), default_value_t = 0)]
pub delay: u8,

/// Verbosity level accumulator, where '-v' is some verbosity and '-vvvv' is very verbose (trace)
#[arg(short('v'), action(ArgAction::Count))]
pub verbosity: u8,

/// Show per-request report
#[arg(long("per-request"))]
pub per_request: bool,

/// Show supported methods
#[arg(long("help-methods"))]
pub help_methods: bool,

/// Produce sample URLs file
#[arg(long("help-file"))]
pub help_file: bool,
}

pub struct Cli {
@@ -75,10 +87,29 @@ impl Cli {
};
cli.set_log_level();

if cli.args.help_methods {
println!("Supported methods: {}", http::list_methods());
return Err(0);
}

if cli.args.help_file {
const URL: &str = "http://site.test/url/path?foo=bar";

println!("# this is a comment, it will be ignored as well as empty lines\n\n");
println!("# following URL will be called with default method: GET");
println!("{}\n", URL);
println!("# to get all available methods use '--help-methods' argument");
for method in http::SUPPORTED_HTTP_METHODS.iter() {
println!("{} {}", method, URL);
}
return Err(0);
}

if !http::method_supported(&cli.args.method) {
println!(
"Defined method is not supported '{}', try '--help-methods'",
&cli.args.method
);
return Err(crate::EXIT_UNKNOWN_METHOD);
}

@@ -98,7 +129,7 @@ impl Cli {
Cli::new(args_collection.into_iter())
}

/// Create a Cli instance with all the args
pub fn new(args: IntoIter<OsString>) -> Result<Cli, clap::error::Error> {
Args::try_parse_from(args).map_or_else(
|err: clap::error::Error| Err(err),
@@ -136,7 +167,7 @@ mod tests {
#[test]
fn test_long_url() {
let test_args = self::create_iter_from_cmd(
"program_name.exe -vvv --method TEST123 --concurrent 2 --repeat 3 --timeout 4 \
--delay 5 http://address.local/long-test",
);

@@ -155,7 +186,7 @@
#[test]
fn test_short_url() {
let test_args = self::create_iter_from_cmd(
"program_name.exe -vvvv -mTEST123 -c2 -r3 -t4 -d5 http://address.local/short-test",
);

let cli = Cli::new(test_args)
@@ -173,7 +204,7 @@

#[test]
fn test_short_file() {
let test_args = self::create_iter_from_cmd("program_name.exe -c3 -r2 -t1 -f filename.txt");

let cli = Cli::new(test_args).map_or_else(|err| panic!("{}", err), |instance| instance);

@@ -191,7 +222,7 @@
#[should_panic]
fn test_wrong_values() {
let test_args =
self::create_iter_from_cmd("program_name.exe --repeat TWICE http://error.local/");

Cli::new(test_args).map_or_else(|err| panic!("{}", err), |instance| instance);
}
@@ -200,7 +231,7 @@
#[should_panic]
fn test_url_and_file_error() {
let test_args =
self::create_iter_from_cmd("program_name.exe --file test.txt http://error.local/");

Cli::new(test_args).map_or_else(|err| panic!("{}", err), |instance| instance);
}
5 changes: 5 additions & 0 deletions src/components/file_processor.rs
@@ -62,6 +62,11 @@ impl<'a> FileProcessor<'a> {
let mut method = "GET";
let mut url = line.trim();

// comment line
if url.starts_with('#') {
return None;
}

// any spaces may indicate method
if let Some(pos) = url.find('\u{20}') {
method = &url[0..pos];
28 changes: 16 additions & 12 deletions src/components/worker.rs
@@ -42,7 +42,7 @@ impl Worker {
/// will show the progress or extra debug info.
///
/// This method will check how many times to repeat, how many concurrent requests to perform,
/// will perform delay between repeats and will check the HTTP client response.
///
/// All the responses will be checked and recorded in `WorkerResult` struct.
pub async fn execute(&mut self, mut requests: Vec<WorkerRequest>) -> Box<WorkerResult> {
@@ -73,23 +73,27 @@
false => concurrent,
};

let mut batch_start = 0;

while batch_start < req_len {
let offset = batch_start + step_size;
let batch_end = if offset < req_len { offset } else { req_len };

let requests_batch = &requests[batch_start..batch_end];
self.enqueue_requests(requests_batch, &mut result);
self.join_queue(&mut result, &mut progress_bar).await;

batch_start = batch_end;

let remain_batches = batch_start < req_len;
let remain_repeats = iteration + 1 < self.repeat;

// check for remaining repeats/batches to avoid trailing delay
if self.delay > 0 && (remain_batches || remain_repeats) {
log::info!("Waiting {}s before the next batch", self.delay);

thread::sleep(Duration::from_secs(self.delay as u64));
}
}
}

