Downloading a large file via HTTP provides an easy test of the complete transmit and receive datapath under a sustained high packet load. The time
command can be used to verify that the expected throughput was maintained during the download.
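As a rough sanity check (assuming, purely for illustration, a gigabit Ethernet link with usable TCP goodput of around 940Mbit/s; substitute the figures for your own link), each download of the 512MB test file created below should take roughly four and a half seconds:

512MB × 8 = 4096Mbit
4096Mbit ÷ ~940Mbit/s ≈ 4.4 seconds per download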
To eliminate other potential sources of delay, the test should download the same file multiple times. On the second and subsequent downloads, the file is likely to be served directly from the web server's cache, which eliminates any disk-related delays. Averaging the times from multiple downloads minimises noise from any transient network conditions.
Create a 512MB file containing pseudo-random data, using e.g.:
dd if=/dev/urandom of=512mb bs=1M count=512
Copy your pseudo-random data file to a web server on your local network.
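A minimal sketch of this step, assuming a web server reachable as webserver with a document root of /var/www/html (both placeholders); alternatively, the directory containing the file can be served ad hoc using Python's built-in web server:

scp 512mb webserver:/var/www/html/

# or, run from the directory containing the file:
python3 -m http.server 8080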
Create an iPXE script which downloads this file multiple times, timing every download except the first. For example:
#!ipxe
imgfetch 512mb ; imgfree
echo Starting test
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
time imgfetch 512mb ; imgfree
shell
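Once the script has run, the ten reported times can be averaged on the host and converted back into throughput. A quick sketch, with placeholder values standing in for the times actually reported:

# placeholder times (in seconds) from the ten timed downloads
times="4.41 4.38 4.40 4.43 4.39 4.40 4.42 4.38 4.41 4.40"
# average them and convert to Mbit/s for a 512MB (4096Mbit) file
echo $times | awk '{ for (i = 1; i <= NF; i++) sum += $i;
                     avg = sum / NF;
                     printf "average %.2f s = %.0f Mbit/s\n", avg, 4096 / avg }'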
Note that the time command is not included in the default build; iPXE must be built with the TIME_CMD option enabled (in config/general.h). The script also assumes that the network interface has already been opened and configured (e.g. using dhcp net0).
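One way to run the test is to chainload the script from the iPXE command line; the server name and script filename here are placeholders:

dhcp net0
chain http://webserver/speedtest.ipxe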