
Criterion giving negative measurements in some cases. #217

Open
cricketer94 opened this issue Aug 16, 2019 · 3 comments

Comments

@cricketer94

I have seen this issue appear before (#161, #162), but I am wondering whether it has been fixed permanently or whether there are still some cases where the library gives negative values.

In my case I cannot post too much information (it's company-proprietary), but I am basically running Haxl code in a GHCi-like environment. Each Haxl module has multiple functions that take in JSON data and produce JSON data. I want to benchmark two things: individual functions within these modules, and the running time of an entire module with all of its functions running in parallel.
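For reference, here is a minimal sketch of the kind of suite being described, using criterion's standard defaultMain/bgroup/bench API; transformRecord, runWholeModule, and sampleInput are hypothetical stand-ins for the proprietary code, not the actual functions:

{-# LANGUAGE OverloadedStrings #-}
import Criterion.Main
import Data.Aeson (Value, object, (.=))

-- Hypothetical stand-in for one proprietary JSON-to-JSON function.
transformRecord :: Value -> Value
transformRecord = id

-- Hypothetical stand-in for running a whole module (e.g. via runHaxl)
-- and returning its JSON result from IO.
runWholeModule :: Value -> IO Value
runWholeModule = pure

sampleInput :: Value
sampleInput = object ["field" .= ("value" :: String)]

main :: IO ()
main = defaultMain
  [ bgroup "individual functions"
      [ bench "transformRecord" $ nf transformRecord sampleInput ]
  , bgroup "whole module"
      [ bench "all functions in parallel" $ nfIO (runWholeModule sampleInput) ]
  ]

(Such a suite is normally compiled with -O2; criterion's measurements of interpreted code in a GHCi-like environment tend to be considerably noisier.)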

I know this is not much information, but any pointers regarding this issue would be great! I will try to provide, to the best of my ability, any additional information I can without divulging the proprietary code.

@RyanGlScott
Member

Some helpful information to know:

  • What exactly is the issue you are experiencing? The title mentions "negative measurements", but it isn't clear to me what that means. Are you getting a particular error message, for instance?
  • What version of criterion are you using?

@cricketer94
Author

cricketer94 commented Aug 16, 2019

Hey @RyanGlScott
So basically, for runs of certain functions I get the output below (the name of the benchmark has been omitted for obvious reasons). As can be seen, the estimates include negative times, which does not make much sense:

time                 34.76 μs   (-64.23 μs .. 81.17 μs)
                     0.006 R²   (0.000 R² .. 0.956 R²)
mean                 21.58 ms   (81.57 μs .. 86.05 ms)
std dev              52.64 ms   (4.077 μs .. 70.63 ms)
variance introduced by outliers: 99% (severely inflated)

I would also like to point out that the KDE curve is expected to be fairly flat, which is why the outliers have such a strong influence.
The version of criterion currently installed at the company is 1.4.1.0.
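For what it's worth, one way to get more samples out of a noisy benchmark like this is to raise criterion's time limit (equivalent to --time-limit on the command line). A minimal sketch, assuming the suite is driven by defaultMainWith; someNoisyAction is a hypothetical placeholder for the real computation:

import Criterion.Main
import Criterion.Types (Config(..))

-- Hypothetical placeholder for the real benchmarked computation.
someNoisyAction :: IO ()
someNoisyAction = pure ()

main :: IO ()
main = defaultMainWith
  defaultConfig { timeLimit = 30 }  -- spend ~30 s collecting samples per benchmark
  [ bench "noisy action" $ nfIO someNoisyAction ]

More samples will not remove the outliers, but they can tighten the confidence intervals enough to show whether the negative lower bounds are purely a statistical artifact of very noisy data.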

@cricketer94
Author

The screenshot below shows the KDE I obtained from the above run.
[Screenshot 2019-08-16 at 13 52 01: KDE plot of the benchmark]
