[v2] Remove ARB_timer_query / EXT_timer_query from quick.py

Submitted by Mark Janes on Nov. 18, 2014, 6:39 p.m.

Details

Message ID 1416335966-11961-1-git-send-email-mark.a.janes@intel.com
State New

Commit Message

Mark Janes Nov. 18, 2014, 6:39 p.m.
EXT_timer_query and ARB_timer_query tests fail intermittently, causing
confusion for developers running quick.py to find regressions.  These
tests have always been intermittent, and people generally know to
ignore them when they fail.

However, if everyone ignores a test, there is no point in running it
all the time.
---
 tests/quick.py | 2 ++
 1 file changed, 2 insertions(+)

Patch

diff --git a/tests/quick.py b/tests/quick.py
index 8762d7d..0856f75 100644
--- a/tests/quick.py
+++ b/tests/quick.py
@@ -12,3 +12,5 @@  del profile.tests['shaders']['glsl-fs-inline-explosion']
 del profile.tests['shaders']['glsl-fs-unroll-explosion']
 del profile.tests['shaders']['glsl-vs-inline-explosion']
 del profile.tests['shaders']['glsl-vs-unroll-explosion']
+del profile.tests['spec']['EXT_timer_query']
+del profile.tests['spec']['ARB_timer_query']
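
For context, quick.py builds on piglit's full test profile and prunes tests
that are too slow or too unreliable for a quick regression run. A minimal
sketch of what the file looks like after this patch (the import line and the
surrounding module layout are assumptions, not part of the patch context):

    # tests/quick.py -- sketch only; the import path is an assumption
    from tests.all import profile

    __all__ = ['profile']

    # Shader "explosion" tests take too long for a quick run.
    del profile.tests['shaders']['glsl-fs-inline-explosion']
    del profile.tests['shaders']['glsl-fs-unroll-explosion']
    del profile.tests['shaders']['glsl-vs-inline-explosion']
    del profile.tests['shaders']['glsl-vs-unroll-explosion']

    # Timer-query tests fail intermittently; drop both groups from the
    # quick profile (this patch).
    del profile.tests['spec']['EXT_timer_query']
    del profile.tests['spec']['ARB_timer_query']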

Comments

I'm not an authority on the subject, but for what it's worth, you have my
R-b.

Reviewed-by: Dylan Baker <dylanx.c.baker@intel.com>

On Tuesday, November 18, 2014 10:39:26 AM Mark Janes wrote:
> EXT_timer_query and ARB_timer_query tests fail intermittently, causing
> confusion for developers running quick.py to find regressions.  These
> tests have always been intermittent, and people generally know to
> ignore them when they fail.
> 
> However, if everyone ignores a test, there is no point in running it
> all the time.
> ---
>  tests/quick.py | 2 ++
>  1 file changed, 2 insertions(+)
> 
> diff --git a/tests/quick.py b/tests/quick.py
> index 8762d7d..0856f75 100644
> --- a/tests/quick.py
> +++ b/tests/quick.py
> @@ -12,3 +12,5 @@ del profile.tests['shaders']['glsl-fs-inline-explosion']
>  del profile.tests['shaders']['glsl-fs-unroll-explosion']
>  del profile.tests['shaders']['glsl-vs-inline-explosion']
>  del profile.tests['shaders']['glsl-vs-unroll-explosion']
> +del profile.tests['spec']['EXT_timer_query']
> +del profile.tests['spec']['ARB_timer_query']
> -- 
> 2.1.3
>
On Tue, Nov 18, 2014 at 1:39 PM, Mark Janes <mark.a.janes@intel.com> wrote:
> EXT_timer_query and ARB_timer_query tests fail intermittently, causing
> confusion for developers running quick.py to find regressions.  These
> tests have always been intermittent, and people generally know to
> ignore them when they fail.
>
> However, if everyone ignores a test, there is no point in running it
> all the time.

FWIW, on nv50 and nvc0, EXT_timer_query time-elapsed and
ARB_timer_query query GL_TIMESTAMP reliably fail, and the rest
reliably pass. I don't remember ever seeing inconsistent behaviour.
FWIW I don't really know what these extensions do, or what the tests
check for, but just thought I'd provide the data point.

> ---
>  tests/quick.py | 2 ++
>  1 file changed, 2 insertions(+)
>
> diff --git a/tests/quick.py b/tests/quick.py
> index 8762d7d..0856f75 100644
> --- a/tests/quick.py
> +++ b/tests/quick.py
> @@ -12,3 +12,5 @@ del profile.tests['shaders']['glsl-fs-inline-explosion']
>  del profile.tests['shaders']['glsl-fs-unroll-explosion']
>  del profile.tests['shaders']['glsl-vs-inline-explosion']
>  del profile.tests['shaders']['glsl-vs-unroll-explosion']
> +del profile.tests['spec']['EXT_timer_query']
> +del profile.tests['spec']['ARB_timer_query']
> --
> 2.1.3
>
Ilia Mirkin <imirkin@alum.mit.edu> writes:

> FWIW, on nv50 and nvc0, EXT_timer_query time-elapsed and
> ARB_timer_query query GL_TIMESTAMP reliably fail, and the rest
> reliably pass. I don't remember ever seeing inconsistent behaviour.
> FWIW I don't really know what these extensions do, or what the tests
> check for, but just thought I'd provide the data point.

We probably see the failure more often because we run tests with gbm, so
the CPU is consistently pegged on all cores.  Comparing CPU times with
GPU times is likely to be error-prone in this situation.

Since it only affects our automated system, we'll exclude it when we
invoke piglit.

-Mark
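
If the groups stay in quick.py, one way an automated setup could keep them
out of its runs is a small local profile layered on top of quick.py,
following the same pattern as this patch; the module name below is
hypothetical, a sketch rather than an existing file:

    # tests/quick_ci.py -- hypothetical local profile for an automated rig
    from tests.quick import profile

    __all__ = ['profile']

    # Timer-query results are unreliable when the CPU is pegged on all
    # cores, as happens on gbm runs, so keep these groups out of CI.
    del profile.tests['spec']['EXT_timer_query']
    del profile.tests['spec']['ARB_timer_query']

Piglit's run command also accepts test-exclusion filters that would serve
the same purpose.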
On Tue, Nov 18, 2014 at 02:48:09PM -0800, Mark Janes wrote:
> Ilia Mirkin <imirkin@alum.mit.edu> writes:
> 
> > FWIW, on nv50 and nvc0, EXT_timer_query time-elapsed and
> > ARB_timer_query query GL_TIMESTAMP reliably fail, and the rest
> > reliably pass. I don't remember ever seeing inconsistent behaviour.
> > FWIW I don't really know what these extensions do, or what the tests
> > check for, but just thought I'd provide the data point.
> 
> We probably see the failure more often because we run tests with gbm, so
> the CPU is consistently pegged on all cores.  Comparing CPU times with
> GPU times is likely to be error-prone in this situation.
> 
> Since it only affects our automated system, we'll exclude it when we
> invoke piglit.
> 
> -Mark

I think we should change the "Couldn't find appropriate number of iterations" to
a skip. That seems to be the common failure on GBM (for me). All other failures
are likely to be real^winteresting failures.
Ben Widawsky <ben@bwidawsk.net> writes:

> On Tue, Nov 18, 2014 at 02:48:09PM -0800, Mark Janes wrote:
>> Ilia Mirkin <imirkin@alum.mit.edu> writes:
>> 
>> > FWIW, on nv50 and nvc0, EXT_timer_query time-elapsed and
>> > ARB_timer_query query GL_TIMESTAMP reliably fail, and the rest
>> > reliably pass. I don't remember ever seeing inconsistent behaviour.
>> > FWIW I don't really know what these extensions do, or what the tests
>> > check for, but just thought I'd provide the data point.
>> 
>> We probably see the failure more often because we run tests with gbm, so
>> the CPU is consistently pegged on all cores.  Comparing CPU times with
>> GPU times is likely to be error-prone in this situation.
>> 
>> Since it only affects our automated system, we'll exclude it when we
>> invoke piglit.
>> 
>> -Mark
>
> I think we should change the "Couldn't find appropriate number of iterations" to
> a skip. That seems to be the common failure on GBM (for me). All other failures
> are likely to be real^winteresting failures.

The failure I've seen on the older machines is "GPU time didn't match
CPU time".  I've already disabled this test in our environment, so it's
a non-issue.

-Mark
On Thu, Nov 20, 2014 at 3:10 PM, Mark Janes <mark.a.janes@intel.com> wrote:
> Ben Widawsky <ben@bwidawsk.net> writes:
>
>> On Tue, Nov 18, 2014 at 02:48:09PM -0800, Mark Janes wrote:
>>> Ilia Mirkin <imirkin@alum.mit.edu> writes:
>>>
>>> > FWIW, on nv50 and nvc0, EXT_timer_query time-elapsed and
>>> > ARB_timer_query query GL_TIMESTAMP reliably fail, and the rest
>>> > reliably pass. I don't remember ever seeing inconsistent behaviour.
>>> > FWIW I don't really know what these extensions do, or what the tests
>>> > check for, but just thought I'd provide the data point.
>>>
>>> We probably see the failure more often because we run tests with gbm, so
>>> the CPU is consistently pegged on all cores.  Comparing CPU times with
>>> GPU times is likely to be error-prone in this situation.
>>>
>>> Since it only affects our automated system, we'll exclude it when we
>>> invoke piglit.
>>>
>>> -Mark
>>
>> I think we should change the "Couldn't find appropriate number of iterations" to
>> a skip. That seems to be the common failure on GBM (for me). All other failures
>> are likely to be real^winteresting failures.
>
> The failure I've seen on the older machines is "GPU time didn't match
> CPU time".  I've already disabled this test in our environment, so it's
> a non-issue.

FWIW "GPU time didn't match CPU time" is the reason for the nv50/nvc0
failures as well:

http://people.freedesktop.org/~imirkin/nv50-comparison/nv92-2014-03-09-mupuf/spec/ARB_timer_query/query%20GL_TIMESTAMP.html
http://people.freedesktop.org/~imirkin/nv50-comparison/nv92-2014-03-09-mupuf/spec/EXT_timer_query/time-elapsed.html
http://people.freedesktop.org/~imirkin/nvc0-comparison/nve7-2014-06-06-mupuf-gs5/spec/ARB_timer_query/query%20GL_TIMESTAMP.html
http://people.freedesktop.org/~imirkin/nvc0-comparison/nve7-2014-06-06-mupuf-gs5/spec/EXT_timer_query/time-elapsed.html

The GPU time is around 5e-6 while the CPU time is around 0.1.

  -ilia
I think it would not hurt to move these two tests into the serialized
category and see if that helps.  If that does not work, adding a note about
older systems (and the use of the gbm backend) is probably a good idea.
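
A sketch of what serializing the two groups in quick.py might look like,
assuming the profile's Test objects expose a run_concurrent attribute (an
assumption about the framework, not something established in this thread):

    # Sketch only: run the timer-query tests serially instead of
    # deleting them, so they are not scheduled alongside other tests.
    for group in ('EXT_timer_query', 'ARB_timer_query'):
        for test in profile.tests['spec'][group].values():
            test.run_concurrent = False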
On Nov 18, 2014 12:39 PM, "Mark Janes" <mark.a.janes@intel.com> wrote:

> EXT_timer_query and ARB_timer_query tests fail intermittently, causing
> confusion for developers running quick.py to find regressions.  These
> tests have always been intermittent, and people generally know to
> ignore them when they fail.
>
> However, if everyone ignores a test, there is no point in running it
> all the time.
> ---
>  tests/quick.py | 2 ++
>  1 file changed, 2 insertions(+)
>
> diff --git a/tests/quick.py b/tests/quick.py
> index 8762d7d..0856f75 100644
> --- a/tests/quick.py
> +++ b/tests/quick.py
> @@ -12,3 +12,5 @@ del profile.tests['shaders']['glsl-fs-inline-explosion']
>  del profile.tests['shaders']['glsl-fs-unroll-explosion']
>  del profile.tests['shaders']['glsl-vs-inline-explosion']
>  del profile.tests['shaders']['glsl-vs-unroll-explosion']
> +del profile.tests['spec']['EXT_timer_query']
> +del profile.tests['spec']['ARB_timer_query']
> --
> 2.1.3
>