[elbe-devel] [PATCH 1/3] commands test: Asynchronous testing

Olivier Dion dion at linutronix.de
Mon Jun 15 17:39:26 CEST 2020


On Mon, 15 Jun 2020, Torben Hohn <torben.hohn at linutronix.de> wrote:
> Regarding parallel tests, i am not sure, if we want/need that
> already. We would like to have the tests output an xml
> file with the results, so that jenkins can visualize the test results.
> (Sorry, that i forgot to mention this)

As discussed, instead of using a process pool, an asynchronous
(fiber-style) unittest runner for Python 3 would make this easier.

> Most of the (advanced) test runners support this.
> https://stackoverflow.com/questions/11241781/python-unittests-in-jenkins

Fortunately we already have some code that generates JUnit output.  Maybe
we can reuse this, or do you want to use something specific?

> If we run several elbe builds in parallel in the same initvm they will
> fail, because there is only support for a single build per user, right
> now. This limit is artificial, and could be removed.
> But we also need to make sure, that elbe stops messing with the
> deamons process environment.
> But for now we have to work with this.
>
> So when we want to run builds in parallel, we need to do this
> via docker. This is already done via Jenkins controlling docker,
> and starting jobs inside the containers.
>
> We do not want to clutter the elbe source code with details, how
> the containers are named, etc.
> So the elbe code must not mess with docker.
>
> But the test code must support something like a filter, where every Nth
> test is run.
>
> for N=4 
> We will fire up 4 containers, and run
>
> "elbe test --parallel-N=4 --parallel-i=0" in container 0
> "elbe test --parallel-N=4 --parallel-i=1" in container 1
> "elbe test --parallel-N=4 --parallel-i=2" in container 2
> "elbe test --parallel-N=4 --parallel-i=3" in container 3

This assumes assigning an ID to every test.  That can be done while
discovering them.  But it's no good on its own, because one container
might end up with all the INITVM tests and another with only very
lightweight tests.

For example:
0: INITVM
1: BASE
2: BASE
3: BASE
4: INITVM
5: BASE
6: BASE
7: BASE
8: INITVM
...
and so on.
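For illustration, a modulo filter over sequentially numbered tests (the
hypothetical --parallel-N/--parallel-i scheme above) could look like
this sketch:

```python
import unittest

def iter_tests(suite):
    """Flatten a (possibly nested) TestSuite into individual test cases."""
    for item in suite:
        if isinstance(item, unittest.TestSuite):
            yield from iter_tests(item)
        else:
            yield item

def filter_modulo(suite, parallel_n, parallel_i):
    """Keep only tests whose discovery index is congruent to i mod N."""
    filtered = unittest.TestSuite()
    for idx, test in enumerate(iter_tests(suite)):
        if idx % parallel_n == parallel_i:
            filtered.addTest(test)
    return filtered
```

This shows exactly the problem described above: the partition depends
only on discovery order, not on how heavy each test is.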

What could be done is the following:

     - One container does all BASE or EXTEND tests.  This could even be
       done on the machine since no container is actually required for
       that.
     
     - Other containers do one specific INITVM test each.

However, the current level filtering uses the '<' operator.  What we
might want is to be able to specify this operator, so we can do
'--level =INITVM'.  In that case we would be able to cherry-pick only
INITVM tests, and the modulo scheme would work.
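A possible shape for such an operator-aware level filter (the level
names, their ordering, and the '=INITVM' syntax are just placeholders
here, not what the code currently does):

```python
import operator

# Hypothetical test levels, ordered from light to heavy.
LEVELS = {"BASE": 0, "EXTEND": 1, "INITVM": 2}

# Longest symbols first, so "<=" is not mistaken for "<".
OPS = [("<=", operator.le), ("<", operator.lt),
       ("=", operator.eq), (">", operator.gt)]

def parse_level_filter(spec):
    """Turn a spec like '=INITVM' or '<INITVM' into a predicate on a level."""
    for sym, op in OPS:
        if spec.startswith(sym):
            wanted = LEVELS[spec[len(sym):]]
            return lambda level: op(LEVELS[level], wanted)
    # No operator given: keep the current '<' behaviour.
    wanted = LEVELS[spec]
    return lambda level: LEVELS[level] < wanted
```

With '=INITVM' only INITVM tests pass the predicate, so each container
can be given exactly the heavy tests it should run.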

> And then jenkins shall merge the resulting junit.xml files.
>
> We only need to parallelize the long running tests.
> But these are probably the ones tagged with INITVM level, right ?

Yes.

> another thing, that i would like to have, is that skipped tests
> report, that they have been skipped. We are on py3 for the tests,
> right ?
>
> https://docs.python.org/3/library/unittest.html#unittest.skipIf
>
> https://docs.python.org/3/library/unittest.html#unittest.SkipTest
>
> So the level should yield skipped tests.

This is rather difficult to do with the current level mechanism.  It
would be nice to have @unittest.skipIf(ElbeTest.level != INITVM), and
it's doable.
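A sketch of what that could look like (ElbeTest.level is hypothetical;
plain module-level names stand in for it here):

```python
import unittest

# Hypothetical level constants and the level selected on the command line.
BASE, EXTEND, INITVM = range(3)
selected_level = BASE

class CdromTest(unittest.TestCase):
    @unittest.skipIf(selected_level != INITVM,
                     "needs an initvm (level INITVM)")
    def test_initvm_build(self):
        pass
```

The point is that skipIf makes the runner record the test as skipped
instead of silently dropping it, so the skip shows up in the result
(and hence in junit.xml for Jenkins).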

> The parallel-N thingy shall maybe not, because we want to merge
> the junit.xmls later.
>
> But, maybe we can make the merge code skip aware, and remove the
> skipped tests from the merged result, iff it has not been skipped in
> a parallel run.

But yeah, this becomes very cumbersome to handle.  Adding something
like '--quiet-skip' could work.  It would default to False, and we
would still collect the results of skipped tests.  I just haven't
added that logic yet.
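A rough sketch of such a skip-aware merge, assuming the common
<testsuite>/<testcase> junit layout (which may differ from what our
JUnit code emits): a skipped testcase survives the merge only if it was
skipped in every parallel run.

```python
import xml.etree.ElementTree as ET

def merge_skip_aware(roots):
    """Merge <testsuite> roots; a test skipped in one run is dropped
    iff it actually ran (was not skipped) in another parallel run."""
    cases = {}  # (classname, name) -> preferred <testcase> element
    for root in roots:
        for case in root.iter("testcase"):
            key = (case.get("classname"), case.get("name"))
            if case.find("skipped") is None:
                cases[key] = case            # a real run always wins
            else:
                cases.setdefault(key, case)  # keep skip only as fallback
    merged = ET.Element("testsuite")
    for case in cases.values():
        merged.append(case)
    return merged
```

Jenkins would then see each test exactly once: as a result if any
container ran it, as a skip only if all containers skipped it.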

> But this needs more logic, because the cdrom rebuild has to be skipped
> also, when the initial build already failed.

This is done in the same test, no?  It's just split into two parts.

-- 
Olivier Dion
Linutronix GmbH | Bahnhofstrasse 3 | D-88690 Uhldingen-Mühlhofen
