Increase tolerances in util/monotonic_time tests

This is an attempt to fix #19974.
commit fce425e3ff
parent d6ca36defa
Author: Nick Mathewson
Date:   2016-12-07 11:08:54 -05:00

2 changed files with 7 additions and 2 deletions

changes/19974 (new file)

@@ -0,0 +1,5 @@
+  o Minor bugfixes (unit tests):
+    - Fix tolerances in unit tests for monotonic time comparisons between
+      nanoseconds and microseconds. Previously, we accepted a 10 us
+      difference only, which is not realistic on every platform's
+      clock_gettime(). Fixes bug 19974; bugfix on 0.2.9.1-alpha.
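
For illustration only (not part of the commit): the changes note blames coarse clock_gettime() resolution. A minimal standalone C sketch of the effect, assuming only POSIX CLOCK_MONOTONIC, shows how two back-to-back samples, one kept in nanoseconds and one truncated to microseconds, can disagree by a full clock tick, which on some platforms is far more than 10 us:

    #define _POSIX_C_SOURCE 199309L
    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
      struct timespec res, a, b;

      /* Some platforms tick CLOCK_MONOTONIC in steps much coarser than
       * 10 us (e.g. around a millisecond), so consecutive samples can
       * jump by a whole tick. */
      clock_getres(CLOCK_MONOTONIC, &res);
      printf("CLOCK_MONOTONIC resolution: %ld ns\n", res.tv_nsec);

      clock_gettime(CLOCK_MONOTONIC, &a);   /* sample kept in nanoseconds */
      clock_gettime(CLOCK_MONOTONIC, &b);   /* sample truncated to microseconds */

      uint64_t nsec = (uint64_t)a.tv_sec * 1000000000 + (uint64_t)a.tv_nsec;
      uint64_t usec = (uint64_t)b.tv_sec * 1000000 + (uint64_t)b.tv_nsec / 1000;

      /* This difference is what the test bounds; a 10 us cap can fail
       * intermittently on coarse clocks. */
      printf("usec - nsec/1000 = %lld us\n", (long long)(usec - nsec / 1000));
      return 0;
    }

On a host where clock_getres() reports a resolution near a millisecond, the printed difference can legitimately exceed 10 us, which is the failure mode the wider 1000 us bound below tolerates.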


@@ -5531,9 +5531,9 @@ test_util_monotonic_time(void *arg)
   tt_u64_op(msecc1, OP_GE, nsecc1 / 1000000);
   tt_u64_op(usecc1, OP_GE, nsecc1 / 1000);
   tt_u64_op(msec1, OP_LE, nsec1 / 1000000 + 1);
-  tt_u64_op(usec1, OP_LE, nsec1 / 1000 + 10);
+  tt_u64_op(usec1, OP_LE, nsec1 / 1000 + 1000);
   tt_u64_op(msecc1, OP_LE, nsecc1 / 1000000 + 1);
-  tt_u64_op(usecc1, OP_LE, nsecc1 / 1000 + 10);
+  tt_u64_op(usecc1, OP_LE, nsecc1 / 1000 + 1000);

  done:
   ;
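
For context, each changed pair of assertions brackets the microsecond sample by the nanosecond one: usec1 may not be smaller than nsec1 / 1000, and may exceed it by at most the slack term this commit widens from 10 to 1000. A self-contained sketch of that invariant, using made-up sample values and a hypothetical helper in place of the test's monotime samples (not Tor API):

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical stand-in for the test's check; nsec and usec model two
     * consecutive monotonic samples in different units. */
    static void
    check_unit_agreement(uint64_t nsec, uint64_t usec)
    {
      assert(usec >= nsec / 1000);         /* monotonic time never decreases */
      assert(usec <= nsec / 1000 + 1000);  /* within 1000 us (was 10 us) */
    }

    int
    main(void)
    {
      /* Samples 500 us apart: fails the old 10 us bound, passes the new one. */
      check_unit_agreement(5000000000ULL, 5000500ULL);
      puts("ok");
      return 0;
    }

With the old + 10 slack the same pair would abort, since 5000500 exceeds 5000000 + 10; under the new + 1000 slack it stays within 5001000.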