Report generated on 18-Nov-2019 at 11:13:05

Environment

Platform: Linux-4.15.0-1028-gcp-x86_64-with-debian-stretch-sid
Python: 3.5.6

Summary

6 tests ran in 1.66 seconds.


6 passed, 0 skipped, 0 failed, 0 errors, 0 expected failures, 0 unexpected passes, 0 rerun

Results

Result Test Duration Links
Passed pytest_pilot/tests/test_main.py::test_ensure_pytest_pilot_installed 0.28
----------------------------- Captured stdout call -----------------------------
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_ensure_pytest_pilot_installed0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_ensure_pytest_pilot_installed0 --trace-config --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp in: /tmp/pytest-of-travis/pytest-0/testdir/test_ensure_pytest_pilot_installed0 PLUGIN registered: <_pytest.config.PytestPluginManager object at 0x7f663dbc8128> PLUGIN registered: <_pytest.config.Config object at 0x7f663bd6ed68> PLUGIN registered: <module '_pytest.mark' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/mark.py'> PLUGIN registered: <module '_pytest.main' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/main.py'> PLUGIN registered: <module '_pytest.terminal' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/terminal.py'> PLUGIN registered: <module '_pytest.runner' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/runner.py'> PLUGIN registered: <module '_pytest.python' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/python.py'> PLUGIN registered: <module '_pytest.pdb' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pdb.py'> PLUGIN registered: <module '_pytest.unittest' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/unittest.py'> PLUGIN registered: <module '_pytest.capture' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/capture.py'> PLUGIN registered: <module '_pytest.skipping' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/skipping.py'> PLUGIN registered: <module '_pytest.tmpdir' from 
'/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/tmpdir.py'> PLUGIN registered: <module '_pytest.monkeypatch' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/monkeypatch.py'> PLUGIN registered: <module '_pytest.recwarn' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/recwarn.py'> PLUGIN registered: <module '_pytest.pastebin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pastebin.py'> PLUGIN registered: <module '_pytest.helpconfig' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/helpconfig.py'> PLUGIN registered: <module '_pytest.nose' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/nose.py'> PLUGIN registered: <module '_pytest.assertion' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/assertion/__init__.py'> PLUGIN registered: <module '_pytest.genscript' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/genscript.py'> PLUGIN registered: <module '_pytest.junitxml' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/junitxml.py'> PLUGIN registered: <module '_pytest.resultlog' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/resultlog.py'> PLUGIN registered: <module '_pytest.doctest' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/doctest.py'> PLUGIN registered: <module '_pytest.cacheprovider' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/cacheprovider.py'> PLUGIN registered: <module 'pytest_pilot.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py'> PLUGIN registered: <module 'pytest_logging.plugin' from 
'/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_logging/plugin.py'> PLUGIN registered: <module 'pytest_html.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_html/plugin.py'> PLUGIN registered: <module 'pytest_harvest.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_harvest/plugin.py'> PLUGIN registered: <_pytest.capture.CaptureManager object at 0x7f663acaf7f0> PLUGIN registered: <Session 'test_ensure_pytest_pilot_installed0'> PLUGIN registered: <_pytest.cacheprovider.LFPlugin object at 0x7f663acb8d30> PLUGIN registered: <_pytest.terminal.TerminalReporter object at 0x7f663ac872b0> PLUGIN registered: <_pytest.python.FixtureManager object at 0x7f663ac87a90> ============================= test session starts ============================== platform linux -- Python 3.5.6, pytest-2.9.2, py-1.8.0, pluggy-0.3.1 using: pytest-2.9.2 pylib-1.8.0 setuptools registered plugins: pytest-pilot-0.1.1.dev1+g29ce83d at /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py pytest-logging-2015.11.4 at /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_logging/plugin.py pytest-html-1.9.0 at /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_html/plugin.py pytest-harvest-1.7.4 at /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_harvest/plugin.py active plugins: cacheprovider : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/cacheprovider.py unittest : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/unittest.py terminalreporter : <_pytest.terminal.TerminalReporter object at 0x7f663ac872b0> terminal : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/terminal.py 140077099155752 : <_pytest.config.PytestPluginManager object at 0x7f663dbc8128> 
harvest : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_harvest/plugin.py session : <Session 'test_ensure_pytest_pilot_installed0'> html : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_html/plugin.py helpconfig : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/helpconfig.py capture : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/capture.py pdb : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pdb.py pastebin : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pastebin.py assertion : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/assertion/__init__.py monkeypatch : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/monkeypatch.py doctest : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/doctest.py python : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/python.py pilot : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py capturemanager : <_pytest.capture.CaptureManager object at 0x7f663acaf7f0> logging : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_logging/plugin.py lfplugin : <_pytest.cacheprovider.LFPlugin object at 0x7f663acb8d30> junitxml : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/junitxml.py nose : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/nose.py mark : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/mark.py recwarn : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/recwarn.py genscript : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/genscript.py funcmanage : 
<_pytest.python.FixtureManager object at 0x7f663ac87a90> main : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/main.py tmpdir : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/tmpdir.py pytestconfig : <_pytest.config.Config object at 0x7f663bd6ed68> runner : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/runner.py skipping : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/skipping.py resultlog : /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/resultlog.py rootdir: /tmp/pytest-of-travis/pytest-0/testdir/test_ensure_pytest_pilot_installed0, inifile: plugins: pilot-0.1.1.dev1+g29ce83d, logging-2015.11.4, html-1.9.0, harvest-1.7.4 collected 0 items ========================= no tests ran in 0.01 seconds ========================= ----------------------------- Captured stderr call -----------------------------
nomatch: '*pytest-pilot-*' and: 'PLUGIN registered: <_pytest.config.PytestPluginManager object at 0x7f663dbc8128>' and: 'PLUGIN registered: <_pytest.config.Config object at 0x7f663bd6ed68>' and: "PLUGIN registered: <module '_pytest.mark' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/mark.py'>" and: "PLUGIN registered: <module '_pytest.main' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/main.py'>" and: "PLUGIN registered: <module '_pytest.terminal' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/terminal.py'>" and: "PLUGIN registered: <module '_pytest.runner' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/runner.py'>" and: "PLUGIN registered: <module '_pytest.python' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/python.py'>" and: "PLUGIN registered: <module '_pytest.pdb' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pdb.py'>" and: "PLUGIN registered: <module '_pytest.unittest' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/unittest.py'>" and: "PLUGIN registered: <module '_pytest.capture' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/capture.py'>" and: "PLUGIN registered: <module '_pytest.skipping' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/skipping.py'>" and: "PLUGIN registered: <module '_pytest.tmpdir' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/tmpdir.py'>" and: "PLUGIN registered: <module '_pytest.monkeypatch' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/monkeypatch.py'>" and: "PLUGIN registered: <module '_pytest.recwarn' from 
'/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/recwarn.py'>" and: "PLUGIN registered: <module '_pytest.pastebin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/pastebin.py'>" and: "PLUGIN registered: <module '_pytest.helpconfig' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/helpconfig.py'>" and: "PLUGIN registered: <module '_pytest.nose' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/nose.py'>" and: "PLUGIN registered: <module '_pytest.assertion' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/assertion/__init__.py'>" and: "PLUGIN registered: <module '_pytest.genscript' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/genscript.py'>" and: "PLUGIN registered: <module '_pytest.junitxml' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/junitxml.py'>" and: "PLUGIN registered: <module '_pytest.resultlog' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/resultlog.py'>" and: "PLUGIN registered: <module '_pytest.doctest' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/doctest.py'>" and: "PLUGIN registered: <module '_pytest.cacheprovider' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/cacheprovider.py'>" and: "PLUGIN registered: <module 'pytest_pilot.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py'>" and: "PLUGIN registered: <module 'pytest_logging.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_logging/plugin.py'>" and: "PLUGIN registered: <module 'pytest_html.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_html/plugin.py'>" and: "PLUGIN 
registered: <module 'pytest_harvest.plugin' from '/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_harvest/plugin.py'>" and: 'PLUGIN registered: <_pytest.capture.CaptureManager object at 0x7f663acaf7f0>' and: "PLUGIN registered: <Session 'test_ensure_pytest_pilot_installed0'>" and: 'PLUGIN registered: <_pytest.cacheprovider.LFPlugin object at 0x7f663acb8d30>' and: 'PLUGIN registered: <_pytest.terminal.TerminalReporter object at 0x7f663ac872b0>' and: 'PLUGIN registered: <_pytest.python.FixtureManager object at 0x7f663ac87a90>' and: '============================= test session starts ==============================' and: 'platform linux -- Python 3.5.6, pytest-2.9.2, py-1.8.0, pluggy-0.3.1' and: 'using: pytest-2.9.2 pylib-1.8.0' and: 'setuptools registered plugins:' fnmatch: '*pytest-pilot-*' with: ' pytest-pilot-0.1.1.dev1+g29ce83d at /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py'
Passed pytest_pilot/tests/test_main.py::test_basic_markers_help 0.27
----------------------------- Captured stdout call -----------------------------
using testdir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_markers_help0')>
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_basic_markers_help0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_basic_markers_help0 --markers --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp
     in: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_markers_help0
@pytest.mark.a(value): mark test to run only when command option a is used to set --a to <value>, or if the option is not used at all.

@pytest.mark.b(value): mark test to run only when command option bbb is used to set --b to <value>.

@pytest.mark.skip(reason=None): skip the given test function with an optional reason. Example: skip(reason="no way of currently testing this") skips the test.

@pytest.mark.skipif(condition): skip the given test function if eval(condition) results in a True value. Evaluation happens within the module global context. Example: skipif('sys.platform == "win32"') skips the test if we are on the win32 platform. see http://pytest.org/latest/skipping.html

@pytest.mark.xfail(condition, reason=None, run=True, raises=None, strict=False): mark the test function as an expected failure if eval(condition) has a True value. Optionally specify a reason for better reporting and run=False if you don't even want to execute the test function. If only specific exception(s) are expected, you can list them in raises, and if the test fails in other ways, it will be reported as a true failure. See http://pytest.org/latest/skipping.html

@pytest.mark.parametrize(argnames, argvalues): call a test function multiple times passing in different arguments in turn. argvalues generally needs to be a list of values if argnames specifies only one name or a list of tuples of values if argnames specifies multiple names. Example: @parametrize('arg1', [1,2]) would lead to two calls of the decorated test function, one with arg1=1 and another with arg1=2. see http://pytest.org/latest/parametrize.html for more info and examples.

@pytest.mark.usefixtures(fixturename1, fixturename2, ...): mark tests as needing all of the specified fixtures. see http://pytest.org/latest/fixture.html#usefixtures

@pytest.mark.tryfirst: mark a hook implementation function such that the plugin machinery will try to call it first/as early as possible.

@pytest.mark.trylast: mark a hook implementation function such that the plugin machinery will try to call it last/as late as possible.

finalizing test dir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_markers_help0')>
----------------------------- Captured stderr call -----------------------------
exact match: '@pytest.mark.a(value): mark test to run only when command option a is used to set --a to <value>, or if the option is not used at all.'
nomatch: '@pytest.mark.b(value): mark test to run only when command option bbb is used to set --b to <value>.'
    and: ''
exact match: '@pytest.mark.b(value): mark test to run only when command option bbb is used to set --b to <value>.'
Passed pytest_pilot/tests/test_main.py::test_basic_options_help 0.28
----------------------------- Captured stdout call -----------------------------
using testdir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_options_help0')>
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_basic_options_help0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_basic_options_help0 --help --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp
     in: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_options_help0
usage: pytest.py [options] [file_or_dir] [file_or_dir] [...]

positional arguments:
  file_or_dir

general:
  -k EXPRESSION         only run tests which match the given substring expression. An expression is a python evaluatable expression where all names are substring-matched against test names and their parent classes. Example: -k 'test_method or test other' matches all test functions and classes whose name contains 'test_method' or 'test_other'. Additionally keywords are matched to classes and functions containing extra names in their 'extra_keyword_matches' set, as well as functions which have names assigned directly to them.
  -m MARKEXPR           only run tests matching given mark expression. example: -m 'mark1 and not mark2'.
  --markers             show markers (builtin, plugin and per-project ones).
  -x, --exitfirst       exit instantly on first error or failed test.
  --maxfail=num         exit after first num failures or errors.
  --strict              run pytest in strict mode, warnings become errors.
  -c file               load configuration from `file` instead of trying to locate one of the implicit configuration files.
  --fixtures, --funcargs
                        show available fixtures, sorted by plugin appearance
  --import-mode={prepend,append}
                        prepend/append to sys.path when importing test modules, default is to prepend.
  --pdb                 start the interactive Python debugger on errors.
  --capture=method      per-test capturing method: one of fd|sys|no.
  -s                    shortcut for --capture=no.
  --runxfail            run tests even if they are marked xfail
  --lf, --last-failed   rerun only the tests that failed at the last run (or all if none failed)
  --ff, --failed-first  run all tests but run the last failures first. This may re-order tests and thus lead to repeated fixture setup/teardown
  --cache-show          show cache contents, don't perform collection or tests
  --cache-clear         remove all cache contents at start of test run.

reporting:
  -v, --verbose         increase verbosity.
  -q, --quiet           decrease verbosity.
  -r chars              show extra test summary info as specified by chars (f)ailed, (E)error, (s)skipped, (x)failed, (X)passed (w)pytest-warnings (p)passed, (P)passed with output, (a)all except pP.
  -l, --showlocals      show locals in tracebacks (disabled by default).
  --report=opts         (deprecated, use -r)
  --tb=style            traceback print mode (auto/long/short/line/native/no).
  --full-trace          don't cut any tracebacks (default is to cut).
  --color=color         color terminal output (yes/no/auto).
  --durations=N         show N slowest setup/test durations (N=0 for all).
  --pastebin=mode       send failed|all info to bpaste.net pastebin service.
  --junit-xml=path      create junit-xml style report file at given path.
  --junit-prefix=str    prepend prefix to classnames in junit-xml output
  --result-log=path     path for machine-readable result log.
  --html=path           create html report file at given path.

collection:
  --collect-only        only collect tests, don't execute them.
  --pyargs              try to interpret all arguments as python packages.
  --ignore=path         ignore path during collection (multi-allowed).
  --confcutdir=dir      only load conftest.py's relative to specified dir.
  --noconftest          Don't load any conftest.py files.
  --doctest-modules     run doctests in all .py modules
  --doctest-glob=pat    doctests file matching pattern, default: test*.txt
  --doctest-ignore-import-errors
                        ignore doctest ImportErrors

test session debugging and configuration:
  --basetemp=dir        base temporary directory for this test run.
  --version             display pytest lib version and import information.
  -h, --help            show help message and configuration info
  -p name               early-load given plugin (multi-allowed). To avoid loading of plugins, use the `no:` prefix, e.g. `no:doctest`.
  --trace-config        trace considerations of conftest.py files.
  --debug               store internal tracing debug information in 'pytestdebug.log'.
  --assert=MODE         control assertion debugging tools. 'plain' performs no assertion debugging. 'reinterp' reinterprets assert statements after they failed to provide assertion expression information. 'rewrite' (the default) rewrites assert statements in test modules on import to provide assert expression information.
  --no-assert           DEPRECATED equivalent to --assert=plain
  --no-magic            DEPRECATED equivalent to --assert=plain
  --genscript=path      create standalone pytest script at given target path.

Logging Configuration:
  --logging-format=LOGGING_FORMAT
                        log format as used by the logging module
  --logging-date-format=LOGGING_DATE_FORMAT
                        log date format as used by the logging module

custom options:
  --flavour=NAME        run tests marked as requiring flavour NAME (marked with @flavour(NAME)), as well as tests not marked with @flavour. If you call `pytest` without this option, tests marked with @flavour will *all* be run
  --envid=NAME          run tests marked as requiring environment NAME (marked with @envid(NAME)), as well as tests not marked with @envid. Important: if you call `pytest` without this option, tests marked with @envid will *not* be run.

[pytest] ini-options in the next pytest.ini|tox.ini|setup.cfg file:
  markers (linelist)            markers for test functions
  norecursedirs (args)          directory patterns to avoid for recursion
  testpaths (args)              directories to search for tests when no files or dire
  usefixtures (args)            list of default fixtures to be used with this project
  python_files (args)           glob-style file patterns for Python test module disco
  python_classes (args)         prefixes or glob names for Python test class discover
  python_functions (args)       prefixes or glob names for Python test function and m
  xfail_strict (bool)           default for the strict parameter of xfail markers whe
  doctest_optionflags (args)    option flags for doctests
  addopts (args)                extra command line options
  minversion (string)           minimally required pytest version
  logging_format (string)       log format as used by the logging module
  logging_date_format (string)  log date format as used by the logging module

environment variables:
  PYTEST_ADDOPTS       extra command line options
  PYTEST_PLUGINS       comma-separated plugins to load during startup
  PYTEST_DEBUG         set to enable debug tracing of pytest's internals

to see available markers type: py.test --markers
to see available fixtures type: py.test --fixtures
(shown according to specified file_or_dir or current dir if not specified)
finalizing test dir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_options_help0')>
----------------------------- Captured stderr call -----------------------------
nomatch: 'custom options:' and: 'usage: pytest.py [options] [file_or_dir] [file_or_dir] [...]' and: '' and: 'positional arguments:' and: ' file_or_dir' and: '' and: 'general:' and: ' -k EXPRESSION only run tests which match the given substring' and: ' expression. An expression is a python evaluatable' and: ' expression where all names are substring-matched' and: ' against test names and their parent classes. Example:' and: " -k 'test_method or test other' matches all test" and: ' functions and classes whose name contains' and: " 'test_method' or 'test_other'. Additionally keywords" and: ' are matched to classes and functions containing extra' and: " names in their 'extra_keyword_matches' set, as well as" and: ' functions which have names assigned directly to them.' and: ' -m MARKEXPR only run tests matching given mark expression.' and: " example: -m 'mark1 and not mark2'." and: ' --markers show markers (builtin, plugin and per-project ones).' and: ' -x, --exitfirst exit instantly on first error or failed test.' and: ' --maxfail=num exit after first num failures or errors.' and: ' --strict run pytest in strict mode, warnings become errors.' and: ' -c file load configuration from `file` instead of trying to' and: ' locate one of the implicit configuration files.' and: ' --fixtures, --funcargs' and: ' show available fixtures, sorted by plugin appearance' and: ' --import-mode={prepend,append}' and: ' prepend/append to sys.path when importing test' and: ' modules, default is to prepend.' and: ' --pdb start the interactive Python debugger on errors.' and: ' --capture=method per-test capturing method: one of fd|sys|no.' and: ' -s shortcut for --capture=no.' and: ' --runxfail run tests even if they are marked xfail' and: ' --lf, --last-failed rerun only the tests that failed at the last run (or' and: ' all if none failed)' and: ' --ff, --failed-first run all tests but run the last failures first. 
This' and: ' may re-order tests and thus lead to repeated fixture' and: ' setup/teardown' and: " --cache-show show cache contents, don't perform collection or tests" and: ' --cache-clear remove all cache contents at start of test run.' and: '' and: 'reporting:' and: ' -v, --verbose increase verbosity.' and: ' -q, --quiet decrease verbosity.' and: ' -r chars show extra test summary info as specified by chars' and: ' (f)ailed, (E)error, (s)skipped, (x)failed, (X)passed' and: ' (w)pytest-warnings (p)passed, (P)passed with output,' and: ' (a)all except pP.' and: ' -l, --showlocals show locals in tracebacks (disabled by default).' and: ' --report=opts (deprecated, use -r)' and: ' --tb=style traceback print mode (auto/long/short/line/native/no).' and: " --full-trace don't cut any tracebacks (default is to cut)." and: ' --color=color color terminal output (yes/no/auto).' and: ' --durations=N show N slowest setup/test durations (N=0 for all).' and: ' --pastebin=mode send failed|all info to bpaste.net pastebin service.' and: ' --junit-xml=path create junit-xml style report file at given path.' and: ' --junit-prefix=str prepend prefix to classnames in junit-xml output' and: ' --result-log=path path for machine-readable result log.' and: ' --html=path create html report file at given path.' and: '' and: 'collection:' and: " --collect-only only collect tests, don't execute them." and: ' --pyargs try to interpret all arguments as python packages.' and: ' --ignore=path ignore path during collection (multi-allowed).' and: " --confcutdir=dir only load conftest.py's relative to specified dir." and: " --noconftest Don't load any conftest.py files." and: ' --doctest-modules run doctests in all .py modules' and: ' --doctest-glob=pat doctests file matching pattern, default: test*.txt' and: ' --doctest-ignore-import-errors' and: ' ignore doctest ImportErrors' and: '' and: 'test session debugging and configuration:' and: ' --basetemp=dir base temporary directory for this test run.' 
and: ' --version display pytest lib version and import information.' and: ' -h, --help show help message and configuration info' and: ' -p name early-load given plugin (multi-allowed). To avoid' and: ' loading of plugins, use the `no:` prefix, e.g.' and: ' `no:doctest`.' and: ' --trace-config trace considerations of conftest.py files.' and: ' --debug store internal tracing debug information in' and: " 'pytestdebug.log'." and: " --assert=MODE control assertion debugging tools. 'plain' performs no" and: " assertion debugging. 'reinterp' reinterprets assert" and: ' statements after they failed to provide assertion' and: " expression information. 'rewrite' (the default)" and: ' rewrites assert statements in test modules on import' and: ' to provide assert expression information.' and: ' --no-assert DEPRECATED equivalent to --assert=plain' and: ' --no-magic DEPRECATED equivalent to --assert=plain' and: ' --genscript=path create standalone pytest script at given target path.' and: '' and: 'Logging Configuration:' and: ' --logging-format=LOGGING_FORMAT' and: ' log format as used by the logging module' and: ' --logging-date-format=LOGGING_DATE_FORMAT' and: ' log date format as used by the logging module' and: '' exact match: 'custom options:' exact match: ' --flavour=NAME run tests marked as requiring flavour NAME (marked' exact match: ' with @flavour(NAME)), as well as tests not marked with' exact match: ' @flavour. If you call `pytest` without this option,' exact match: ' tests marked with @flavour will *all* be run' exact match: ' --envid=NAME run tests marked as requiring environment NAME (marked' exact match: ' with @envid(NAME)), as well as tests not marked with' exact match: ' @envid. Important: if you call `pytest` without this' exact match: ' option, tests marked with @envid will *not* be run.'
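The "custom options" help above describes two different filtering semantics: `--flavour` is a permissive ("soft") filter, where marked tests still run when the option is absent, while `--envid` is a restrictive ("hard") filter, where marked tests are skipped unless the option selects them. A minimal sketch of that rule as a plain function (the name `should_run` and its signature are illustrative, not pytest-pilot's actual API):

```python
def should_run(mark_value, query, hard_filter):
    """Decide whether a test runs, per the help text above.

    mark_value  -- value the test is marked with, or None if unmarked
    query       -- value passed on the command line, or None if absent
    hard_filter -- True for --envid-style semantics, False for --flavour-style
    """
    if mark_value is None:          # unmarked tests always run
        return True
    if query is None:               # no query given on the command line:
        return not hard_filter      # soft filter runs marked tests, hard skips them
    return mark_value == query      # query given: run only matching marks

# --flavour (soft): without the option, marked tests are *all* run
assert should_run("red", None, hard_filter=False) is True
# --envid (hard): without the option, marked tests are *not* run
assert should_run("env1", None, hard_filter=True) is False
# with a query, only matching marks run
assert should_run("env1", "env1", hard_filter=True) is True
assert should_run("env2", "env1", hard_filter=True) is False
```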
Passed pytest_pilot/tests/test_main.py::test_basic_run_envquery 0.28
----------------------------- Captured stdout call -----------------------------
using testdir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0')>
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0 -v -s --envid env1 --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp
     in: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0
============================= test session starts ==============================
platform linux -- Python 3.5.6, pytest-2.9.2, py-1.8.0, pluggy-0.3.1 -- /home/travis/miniconda/envs/test-environment/bin/python
cachedir: .cache
rootdir: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0, inifile:
plugins: pilot-0.1.1.dev1+g29ce83d, logging-2015.11.4, html-1.9.0, harvest-1.7.4
collecting ... collected 5 items

test_basic_run_envquery.py::test_yellow_noenv PASSED
test_basic_run_envquery.py::test_yellow_env1 PASSED
test_basic_run_envquery.py::test_env2 SKIPPED
test_basic_run_envquery.py::test_red_noenv PASSED
test_basic_run_envquery.py::test_nomark PASSED

===================== 4 passed, 1 skipped in 0.01 seconds ======================
finalizing test dir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_envquery0')>
Passed pytest_pilot/tests/test_main.py::test_basic_run_flavourquery 0.28
----------------------------- Captured stdout call -----------------------------
using testdir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0')>
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0 -v -s --flavour red --envid env2 --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp
     in: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0
============================= test session starts ==============================
platform linux -- Python 3.5.6, pytest-2.9.2, py-1.8.0, pluggy-0.3.1 -- /home/travis/miniconda/envs/test-environment/bin/python
cachedir: .cache
rootdir: /tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0, inifile:
plugins: pilot-0.1.1.dev1+g29ce83d, logging-2015.11.4, html-1.9.0, harvest-1.7.4
collecting ... collected 5 items

test_basic_run_flavourquery.py::test_yellow_noenv SKIPPED
test_basic_run_flavourquery.py::test_yellow_env1 SKIPPED
test_basic_run_flavourquery.py::test_env2 PASSED
test_basic_run_flavourquery.py::test_red_noenv PASSED
test_basic_run_flavourquery.py::test_nomark PASSED

===================== 3 passed, 2 skipped in 0.01 seconds ======================
finalizing test dir <Testdir local('/tmp/pytest-of-travis/pytest-0/testdir/test_basic_run_flavourquery0')>
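The skip patterns in the two runs above (--envid env1 skips only test_env2; --flavour red --envid env2 skips the two "yellow" tests) are consistent with the soft/hard filter semantics stated in the --help output. Assuming marks inferred from the test names (an assumption; the test module source is not shown in this report), the observed outcomes can be reproduced with a short sketch:

```python
# Inferred (not shown in the report) marks: (flavour, envid), None = unmarked
tests = {
    "test_yellow_noenv": ("yellow", None),
    "test_yellow_env1":  ("yellow", "env1"),
    "test_env2":         (None, "env2"),
    "test_red_noenv":    ("red", None),
    "test_nomark":       (None, None),
}

def runs(flavour_mark, envid_mark, flavour_query, envid_query):
    """Combine a soft (--flavour) and a hard (--envid) filter, per the help text."""
    def ok(mark, query, hard):
        if mark is None:            # unmarked: always runs
            return True
        if query is None:           # option absent: soft runs, hard skips
            return not hard
        return mark == query        # option given: only matching marks run
    return (ok(flavour_mark, flavour_query, hard=False)
            and ok(envid_mark, envid_query, hard=True))

# Run 1: -v -s --envid env1  ->  4 passed, 1 skipped
skipped = {n for n, (f, e) in tests.items() if not runs(f, e, None, "env1")}
assert skipped == {"test_env2"}

# Run 2: -v -s --flavour red --envid env2  ->  3 passed, 2 skipped
skipped = {n for n, (f, e) in tests.items() if not runs(f, e, "red", "env2")}
assert skipped == {"test_yellow_noenv", "test_yellow_env1"}
```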
Passed pytest_pilot/tests/test_main.py::test_nameconflict 0.24
----------------------------- Captured stdout call -----------------------------
running: /home/travis/miniconda/envs/test-environment/bin/python /home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/test_nameconflict0/runpytest-0 /tmp/pytest-of-travis/pytest-0/testdir/test_nameconflict0 --basetemp=/tmp/pytest-of-travis/pytest-0/testdir/basetemp
     in: /tmp/pytest-of-travis/pytest-0/testdir/test_nameconflict0
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py", line 17, in <module>
    raise SystemExit(pytest.main())
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 39, in main
    config = _prepareconfig(args, plugins)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 118, in _prepareconfig
    pluginmanager=pluginmanager, args=args)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
    return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
    _MultiCall(methods, kwargs, hook.spec_opts).execute()
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
    return _wrapped_call(hook_impl.function(*args), self.execute)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 249, in _wrapped_call
    wrap_controller.send(call_outcome)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/helpconfig.py", line 28, in pytest_cmdline_parse
    config = outcome.get_result()
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 278, in get_result
    raise ex[1].with_traceback(ex[2])
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in __init__
    self.result = func()
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
    res = hook_impl.function(*args)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 861, in pytest_cmdline_parse
    self.parse(args)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 966, in parse
    self._preparse(args, addopts=addopts)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 937, in _preparse
    args=args, parser=self._parser)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
    return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
    _MultiCall(methods, kwargs, hook.spec_opts).execute()
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
    return _wrapped_call(hook_impl.function(*args), self.execute)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 249, in _wrapped_call
    wrap_controller.send(call_outcome)
  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py", line 72, in pytest_load_initial_conftests
    "name(s): %s" % (marker, conflicting))
ValueError: Error registering marker 'Pytest marker 'color' with commandline option '--color' and pytest mark '@pytest.mark.color(<color>)'': a command with this name already exists. Conflicting name(s): ['--color']
nomatch: "ValueError: Error registering marker 'Pytest marker 'color' with commandline option '--color' and pytest mark '@pytest.mark.color(<color>)'': a command with this name already exists. Conflicting name(s): ['--color']"
    and: 'Traceback (most recent call last):'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest.py", line 17, in <module>'
    and: '    raise SystemExit(pytest.main())'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 39, in main'
    and: '    config = _prepareconfig(args, plugins)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 118, in _prepareconfig'
    and: '    pluginmanager=pluginmanager, args=args)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__'
    and: '    return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec'
    and: '    return self._inner_hookexec(hook, methods, kwargs)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>'
    and: '    _MultiCall(methods, kwargs, hook.spec_opts).execute()'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute'
    and: '    return _wrapped_call(hook_impl.function(*args), self.execute)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 249, in _wrapped_call'
    and: '    wrap_controller.send(call_outcome)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/helpconfig.py", line 28, in pytest_cmdline_parse'
    and: '    config = outcome.get_result()'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 278, in get_result'
    and: '    raise ex[1].with_traceback(ex[2])'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in __init__'
    and: '    self.result = func()'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute'
    and: '    res = hook_impl.function(*args)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 861, in pytest_cmdline_parse'
    and: '    self.parse(args)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 966, in parse'
    and: '    self._preparse(args, addopts=addopts)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/config.py", line 937, in _preparse'
    and: '    args=args, parser=self._parser)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__'
    and: '    return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec'
    and: '    return self._inner_hookexec(hook, methods, kwargs)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>'
    and: '    _MultiCall(methods, kwargs, hook.spec_opts).execute()'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute'
    and: '    return _wrapped_call(hook_impl.function(*args), self.execute)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 249, in _wrapped_call'
    and: '    wrap_controller.send(call_outcome)'
    and: '  File "/home/travis/miniconda/envs/test-environment/lib/python3.5/site-packages/pytest_pilot/plugin.py", line 72, in pytest_load_initial_conftests'
    and: '    "name(s): %s" % (marker, conflicting))'
exact match: "ValueError: Error registering marker 'Pytest marker 'color' with commandline option '--color' and pytest mark '@pytest.mark.color(<color>)'': a command with this name already exists. Conflicting name(s): ['--color']"