Hi,
BIT V2.0 build 1006 for Linux has an issue.
When performing the network test with the default setting, option #1 ("Every bad packet generates an error"), it does not generate an error for every bad packet.
When performing the network test with option #2 ("High bad packet ratio generates an error"), it generates an error for every bad packet.
So the two options seem to behave opposite to their descriptions. Is that so?
Thanks!