X-Git-Url: https://gerrit.opnfv.org/gerrit/gitweb?a=blobdiff_plain;f=docs%2Ftesting%2Fecosystem%2Foverview.rst;h=50569a734a1e5185137b7e206b594a72d85d0e5c;hb=b9fcb793078d85577827ee7d886a4d9f4e9c150a;hp=a895d1ccc27e2fd279ce7a48b308997408ff5c2f;hpb=b54f4712cea42b96d5a6c5a63066119f6e90470e;p=opnfvdocs.git

diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst
index a895d1ccc..50569a734 100644
--- a/docs/testing/ecosystem/overview.rst
+++ b/docs/testing/ecosystem/overview.rst
@@ -1,21 +1,21 @@
 .. This work is licensed under a Creative Commons Attribution 4.0 International License.
 .. SPDX-License-Identifier: CC-BY-4.0
 
-=============
-OPNFV testing
-=============
+======================
+OPNFV Testing Overview
+======================
 
 Introduction
 ============
 
-Testing is one of the key activities in OPNFV and includes unit, feature, component, system
-level testing for development, automated deployment, performance characterization or stress
-testing.
+Testing is one of the key activities in OPNFV and includes unit, feature,
+component and system-level testing for development, automated deployment,
+performance characterization and stress testing.
 
-Test projects are dedicated to provide frameworks, tooling and test-cases catagorized as
+Test projects are dedicated to providing frameworks, tooling and test-cases categorized as
 functional, performance or compliance testing. Test projects fulfill different roles such as
 verifying VIM functionality, benchmarking components and platforms or analysis of measured
-KPIs for the scenarios released in OPNFV.
+KPIs for OPNFV release scenarios.
 
 Feature projects also provide their own test suites that either run independently or within a
 test project.
 
@@ -24,15 +24,12 @@ This document details the OPNFV testing ecosystem, describes common test compon
 by individual OPNFV projects and provides links to project specific documentation.
 
 
-OPNFV testing ecosystem
-=======================
-
-The testing projects
---------------------
+The OPNFV Testing Ecosystem
+===========================
 
-The OPNFV testing projects may be summarized as follows:
+The OPNFV testing projects are represented in the following diagram:
 
-.. figure:: https://wiki.opnfv.org/download/attachments/8688867/EcoSystem%20Copy.png
+.. figure:: ../../images/OPNFV_testing_working_group.png
    :align: center
    :alt: Overview of OPNFV Testing projects
 
@@ -41,7 +38,7 @@ The major testing projects are described in the table below:
 +----------------+---------------------------------------------------------+
 | Project | Description |
 +================+=========================================================+
-| Bottlenecks | This project aims to find system bottlenecks by testing |
+| Bottlenecks | This project aims to find system bottlenecks by testing |
 | | and verifying OPNFV infrastructure in a staging |
 | | environment before committing it to a production |
 | | environment. Instead of debugging a deployment in |
@@ -52,20 +49,24 @@ The major testing projects are described in the table below:
 | | analysis of the OPNFV infrastructure. |
 +----------------+---------------------------------------------------------+
 | CPerf | SDN Controller benchmarks and performance testing, |
-| | applicable to controllers in general. |
+| | applicable to controllers in general. Collaboration of |
+| | upstream controller testing experts, external test tool |
+| | developers and the standards community. Primarily |
+| | contribute to upstream/external tooling, then add jobs |
+| | to run those tools on OPNFV's infrastructure. |
 +----------------+---------------------------------------------------------+
 | Dovetail | This project intends to define and provide a set of |
-| | OPNFV related validation criteria that will provide |
-| | input for the evaluation of the use of OPNFV trademarks.|
-| | The dovetail project is executed with the guidance and |
-| | oversight of the Compliance and Certification committee |
-| | and work to secure the goals of the C&C committee for |
-| | each release. The project intends to incrementally |
-| | define qualification criteria that establish the |
-| | foundations of how we are able to measure the ability to|
-| | utilize the OPNFV platform, how the platform itself |
-| | should behave, and how applications may be deployed on |
-| | the platform. |
+| | OPNFV related validation criteria/tests that will |
+| | provide input for the OPNFV Compliance Verification |
+| | Program. The Dovetail project is executed with the |
+| | guidance and oversight of the Compliance and |
+| | Certification (C&C) committee and works to secure the |
+| | goals of the C&C committee for each release. The |
+| | project intends to incrementally define qualification |
+| | criteria that establish the foundations of how one is |
+| | able to measure the ability to utilize the OPNFV |
+| | platform, how the platform itself should behave, and |
+| | how applications may be deployed on the platform. |
 +----------------+---------------------------------------------------------+
 | Functest | This project deals with the functional testing of the |
 | | VIM and NFVI. It leverages several upstream test suites |
@@ -73,7 +74,22 @@ The major testing projects are described in the table below:
 | | project to launch feature test suites in CI/CD. |
 | | The project is used for scenario validation. |
 +----------------+---------------------------------------------------------+
-| Qtip | QTIP as the project for "Platform Performance |
+| NFVbench | NFVbench is a compact and self-contained data plane |
+| | performance measurement tool for OpenStack based NFVI |
+| | platforms. It is agnostic of the NFVI distribution, |
+| | Neutron networking implementation and hardware. |
+| | It runs on any Linux server with a DPDK compliant |
+| | NIC connected to the NFVI platform data plane and |
+| | bundles a highly efficient software traffic generator. |
+| | It provides a fully automated measurement of most common|
+| | packet paths at any level of scale and load using |
+| | RFC-2544. It is available as a Docker container with |
+| | simple command line and REST interfaces. |
+| | It is easy to use as it takes care of most of the |
+| | guesswork generally associated with data plane |
+| | benchmarking, and can run in any lab or in production |
+| | environments. |
++----------------+---------------------------------------------------------+
+| QTIP | QTIP as the project for "Platform Performance |
 | | Benchmarking" in OPNFV aims to provide user a simple |
 | | indicator for performance, supported by comprehensive |
 | | testing data and transparent calculation formula. |
@@ -81,20 +97,27 @@ The major testing projects are described in the table below:
 | | performance benchmarking which helps users to build |
 | | indicators by themselves with ease. |
 +----------------+---------------------------------------------------------+
-| Storperf | The purpose of this project is to provide a tool to |
+| StorPerf | The purpose of this project is to provide a tool to |
 | | measure block and object storage performance in an NFVI.|
 | | When complemented with a characterization of typical VF |
 | | storage performance requirements, it can provide |
 | | pass/fail thresholds for test, staging, and production |
 | | NFVI environments. |
 +----------------+---------------------------------------------------------+
-| VSperf | This project provides a framework for automation of NFV |
-| | data-plane performance testing and benchmarking. The |
-| | NFVI fast-path includes switch technology and network |
-| | with physical and virtual interfaces. VSperf can be |
-| | used to evaluate the suitability of different Switch |
-| | implementations and features, quantify data-path |
-| | performance and optimize platform configurations. |
+| VSPERF | VSPERF is an OPNFV project that provides an automated |
+| | test-framework and comprehensive test suite based on |
+| | Industry Test Specifications for measuring NFVI |
+| | data-plane performance. The data-path includes switching|
+| | technologies with physical and virtual network |
+| | interfaces. The VSPERF architecture is switch and |
+| | traffic generator agnostic and test cases can be easily |
+| | customized. Software versions and configurations |
+| | including the vSwitch (OVS or VPP) as well as the |
+| | network topology are controlled by VSPERF (independent |
+| | of OpenStack). VSPERF is used as a development tool for |
+| | optimizing switching technologies, qualification of |
+| | packet processing components and for pre-deployment |
+| | evaluation of the NFV platform data-path. |
 +----------------+---------------------------------------------------------+
 | Yardstick | The goal of the Project is to verify the infrastructure |
 | | compliance when running VNF applications. NFV Use Cases |
@@ -108,16 +131,27 @@ The major testing projects are described in the table below:
 +----------------+---------------------------------------------------------+
 
 
-===================================
-The testing working group resources
-===================================
+===============================
+Testing Working Group Resources
+===============================
 
-The assets
-==========
+Test Results Collection Framework
+=================================
 
-Overall Architecture
---------------------
-The Test result management can be summarized as follows::
+Any test project running in the global OPNFV lab infrastructure that is
+integrated with OPNFV CI can push test results to the community Test Database
+using a common Test API. This database can be used to track the evolution of
+testing and analyse test runs to compare results across installers, scenarios
+and technically and geographically diverse hardware environments.
+
+Results from the database are used to generate a dashboard with the current test
+status for each testing project. Please note that you can also deploy the Test
+Database and Test API locally in your own environment.
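+
+As an illustration, the short sketch below pulls results from the community
+Test Database through the Test API and counts them per installer and scenario.
+It is a minimal sketch only: the results endpoint, its pagination behaviour and
+the response layout are assumptions that should be checked against the Test API
+documentation referenced later in this document.
+
+.. code-block:: python
+
+    # Minimal sketch: query the community Test API and count results per
+    # installer/scenario pair. Endpoint path and response keys are
+    # assumptions; verify them against the Test API documentation.
+    import collections
+    import requests
+
+    TEST_API = "http://testresults.opnfv.org/test/api/v1"
+
+    response = requests.get("{}/results".format(TEST_API), timeout=30)
+    response.raise_for_status()
+    results = response.json().get("results", [])
+
+    summary = collections.Counter(
+        (result.get("installer"), result.get("scenario")) for result in results)
+
+    for (installer, scenario), count in summary.most_common():
+        print("{} / {}: {} results".format(installer, scenario, count))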
+
+Overall Test Architecture
+-------------------------
+
+The management of test results can be summarized as follows::
 
   +-------------+    +-------------+    +-------------+
   |             |    |             |    |             |
@@ -145,182 +179,173 @@ The Test result management can be summarized as follows::
 |             |
 +----------------------+          +----------------------+
 |                      |          |                      |
-| Testing Dashboards   |          | Landing page         |
+| Testing Dashboards   |          | Test Landing page    |
 |                      |          |                      |
 +----------------------+          +----------------------+
 
 
-The testing databases
----------------------
-A Mongo DB Database has been introduced for the Brahmaputra release.
+The Test Database
+-----------------
+A Mongo DB Database was introduced for the Brahmaputra release.
 The following collections are declared in this database:
 
   * pods: the list of pods used for production CI
   * projects: the list of projects providing test cases
-  * testcases: the test cases related to a given project
+  * test cases: the test cases related to a given project
   * results: the results of the test cases
  * scenarios: the OPNFV scenarios tested in CI
 
-This database can be used by any project through the testapi.
-Please note that projects may also use additional databases. This database is
-mainly use to colelct CI results and scenario trust indicators.
+This database can be used by any project through the Test API.
+Please note that projects may also use additional databases. The Test
+Database is mainly used to collect CI test results and generate scenario
+trust indicators. The Test Database is also cloned for OPNFV Plugfests in
+order to provide a private datastore only accessible to Plugfest participants.
 
-This database is also cloned for OPNFV Plugfest.
 
+Test API Description
+--------------------
+The Test API is used to declare pods, projects, test cases and test results.
+Pods correspond to clusters of machines (3 controller and 2 compute nodes in
+HA mode) used to run the tests and are defined in the Pharos project.
+The results pushed to the database are related to pods, projects and test cases.
+Trying to push results generated from a non-referenced pod will cause the Test
+API to return an error message (see the sketch at the end of this section).
+
+The data model is very basic; five objects are available:
+
+  * Pods
+  * Projects
+  * Test cases
+  * Results
+  * Scenarios
 
-The test API
-------------
-The Test API is used to declare pods, projects, test cases and test
-results. Pods are the pods used to run the tests.
-The results pushed in the database are related to pods, projects and
-cases. If you try to push results of test done on non referenced pod,
-the API will return an error message.
+For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html
 
-An additional method dashboard has been added to post-process
-the raw results in the Brahmaputra release (deprecated in Colorado release).
+The code of the Test API is hosted in the releng repository `[TST2]`_.
+The static documentation of the Test API can be found at `[TST3]`_.
+The Test API has been dockerized and may be installed locally in your lab.
 
-The data model is very basic, 4 objects are created:
+The deployment of the Test API has been automated.
+A Jenkins job manages:
 
-  * Pods
-  * Projects
-  * Testcases
-  * Results
+  * the unit tests of the Test API
+  * the creation of a new Docker file
+  * the deployment of the new Test API
+  * the archiving of the old Test API
+  * the backup of the Mongo DB
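+
+As a small illustration of the rule above on non-referenced pods, the sketch
+below checks whether a pod is already declared in the Test Database before any
+results are pushed for it. The pods endpoint path, the response keys and the
+pod name used here are assumptions for illustration only; verify them against
+the Test API documentation `[TST3]`_.
+
+.. code-block:: python
+
+    # Minimal sketch: check that a pod is declared before pushing results
+    # for it. Endpoint path, response keys and the pod name are assumptions.
+    import requests
+
+    TEST_API = "http://testresults.opnfv.org/test/api/v1"
+
+    def pod_is_declared(pod_name):
+        """Return True if the pod is already referenced in the Test Database."""
+        response = requests.get("{}/pods".format(TEST_API), timeout=30)
+        response.raise_for_status()
+        pods = response.json().get("pods", [])
+        return any(pod.get("name") == pod_name for pod in pods)
+
+    if not pod_is_declared("example-pod"):
+        print("Pod is not declared: results pushed for it would be rejected.")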
+
+Test API Authorization
+----------------------
+
+PUT/DELETE/POST operations of the TestAPI now require token-based authorization. The token needs
+to be added to the request in an 'X-Auth-Token' header in order to access the database.
+
+e.g.::
+
+    headers['X-Auth-Token']
+
+The value of the header, i.e. the token, can be accessed via the Jenkins environment variable
+*TestApiToken*. The token value is added as a masked password.
+
+.. code-block:: python
+
+    headers['X-Auth-Token'] = os.environ.get('TestApiToken')
+
+The above example is in Python. Token-based authentication has been added so
+that only CI pods running Jenkins jobs can access the database. Please note
+that token authorization is currently implemented but not yet enabled.
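+
+Putting the pieces together, the sketch below shows how a CI job could push one
+result with the 'X-Auth-Token' header. The endpoint path and the example values
+(pod, installer, build tag) are illustrative assumptions, and the result fields
+shown should be checked against the Test API documentation `[TST3]`_.
+
+.. code-block:: python
+
+    # Minimal sketch: push one test result with token-based authorization.
+    # Endpoint path and example values are assumptions for illustration.
+    import os
+    import requests
+
+    TEST_API = "http://testresults.opnfv.org/test/api/v1"
+
+    headers = {'X-Auth-Token': os.environ.get('TestApiToken')}
+    result = {
+        "case_name": "vping_ssh",
+        "project_name": "functest",
+        "pod_name": "example-pod",          # must be a pod declared in the database
+        "installer": "installer_name",      # e.g. the installer under test
+        "version": "master",
+        "scenario": "os-odl_l2-nofeature-ha",
+        "criteria": "PASS",
+        "build_tag": "example-build-tag",
+        "start_date": "2018-01-01 00:00:00",
+        "stop_date": "2018-01-01 00:10:00",
+    }
+
+    response = requests.post("{}/results".format(TEST_API),
+                             json=result, headers=headers, timeout=30)
+    print(response.status_code, response.text)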
+
+
+Test Project Reporting
+======================
+The reporting page for the test projects is http://testresults.opnfv.org/reporting/
+
+.. figure:: ../../images/reporting_page.png
+   :align: center
+   :alt: Testing group reporting page
+
+This page provides reporting per OPNFV release and per testing project.
+
+.. figure:: ../../images/reportingMaster.png
+   :align: center
+   :alt: Testing group Euphrates reporting page
+
+An evolution of the reporting page is planned to unify test reporting by creating
+a landing page that shows the scenario status at a glance (this information was
+previously consolidated manually on a wiki page). The landing page will be displayed
+per scenario and show:
 
-Pods::
-
-    {
-        "id": ,
-        "details": ,
-        "creation_date": "YYYY-MM-DD HH:MM:SS",
-        "name": ,
-        "mode": ,
-        "role":
-    },
-
-Projects::
-
-    {
-        "id": ,
-        "name": ,
-        "creation_date": "YYYY-MM-DD HH:MM:SS",
-        "description":
-    },
-
-Testcases::
-
-    {
-        "id": ,
-        "name":,
-        "project_name":,
-        "creation_date": "YYYY-MM-DD HH:MM:SS",
-        "description": ,
-        "url":
-    },
-
-Results::
-
-    {
-        "_id": ,
-        "case_name": ,
-        "project_name": ,
-        "pod_name": ,
-        "installer": ,
-        "version": ,
-        "start_date": "YYYY-MM-DD HH:MM:SS",
-        "stop_date": "YYYY-MM-DD HH:MM:SS",
-        "build_tag": ,
-        "scenario": ,
-        "criteria": ,
-        "trust_indicator": {
-            "current": 0,
-            "histories": []
-        }
-    }
-
- Scenarios::
-
-    {
-        "id": ,
-        "name":,
-        "name" : "os-odl_l2-nofeature-ha",
-        "installers":[
-            {
-            "installer" : ,
-            "versions": [
-                {
-                    "version": ,
-                    "owner": ,
-                    "custom list": { "projects": [{
-                        "functest" : [ "vping_ssh", "vping_userdata", "tempest_smoke_serial", "rally_sanity", "odl", "doctor"],
-                        "yardstick" : [ "tc002","tc005","tc010","tc011","tc012","tc014","tc037","tc055","tc063","tc069","tc070","tc071","tc072","tc075"]}]},
-                    "score": { "projects": [{
-                        "functest" : [{"date": YYY-MM-DD HH:MM, "score":}, {"date": YYY-MM-DD HH:MM, "score":}, ...],
-                        "yardstick" : [{"date": YYY-MM-DD HH:MM, "score":}, {"date": YYY-MM-DD HH:MM, "score":}, ...]}]},
-                    "trust_indicator": { "projects": [{
-                        "functest" : [{"date": YYY-MM-DD HH:MM,"status":}, {"date": YYY-MM-DD HH:MM,"status":},...],
-                        "yardstick" : [{"date": YYY-MM-DD HH:MM,"status":}, {"date": YYY-MM-DD HH:MM,"status":},...]}]}},
-                { ....
-        },
-
-For detailed information, please go to
-
-   http://testresults.opnfv.org/test/swagger/spec.html
-
-   Authentication: opnfv/api@opnfv
-
-Please notes that POST/DELETE/PUT operations for test or study purpose via
-swagger website is not allowed, because it will change the real data in
-the database.
-
-
-The reporting
--------------
-Until the Colorado release, each testing project was reporting a status on a dedicated page.
-It was decided to unify the reporting by creating a landing page that should give
-the scenario status in one glance (it was previously consolidated manually
-on a wiki page). The landing page will be display per scenario:
   * the status of the deployment
-  * the score of the test projectS
+  * the score from each test suite. There is no overall score; it is
+    determined by each test project.
   * a trust indicator
 
-Additional filters (version, installer, test collection time window,... )
-This landing page has been dockerized. The back end relies on the testing DB.
+Test Case Catalog
+=================
+Until the Colorado release, each testing project managed the list of its
+test cases. This made it very hard to have a global view of the available test
+cases from the different test projects. A common view was possible through the API
+but it was not very user-friendly.
+Test cases per project may be listed by calling:
 
- TODO: add picture
+  http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases
 
-The test case catalog
-----------------------
-Until the Colorado release, each testing project was managing the list of its test cases. It
-was very hard to have a global view of the available test cases among the
-different test projects. A common view was possible through the API but it was
-not very user friendly.
-It was decided to build a web site providing a consistent view of the test cases
-per project and allow any scenario owner to build his/her custom list of tests.
-The test catalog can be described as below::
+with project_name one of: bottlenecks, functest, qtip, storperf, vsperf, yardstick
 
- TODO: add picture
+A test case catalog is now available `[TST4]`_. Roll over a project and click to
+get its list of test cases, then click on a case to get more details.
 
-Other resources
+.. figure:: ../../images/TestcaseCatalog.png
+   :align: center
+   :alt: Testing group testcase catalog
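+
+For example, the documented URL above can be queried directly. The sketch below
+assumes the response is JSON and that the list and name keys are called
+'testcases' and 'name'; check the Test API documentation `[TST3]`_ for the
+exact layout.
+
+.. code-block:: python
+
+    # Minimal sketch: list the test cases declared by each test project
+    # through the documented projects/<project_name>/cases URL.
+    # The response keys used below are assumptions.
+    import requests
+
+    TEST_API = "http://testresults.opnfv.org/test/api/v1"
+    PROJECTS = ["bottlenecks", "functest", "qtip", "storperf", "vsperf", "yardstick"]
+
+    for project in PROJECTS:
+        response = requests.get("{}/projects/{}/cases".format(TEST_API, project),
+                                timeout=30)
+        response.raise_for_status()
+        cases = response.json().get("testcases", [])
+        print("{}: {} test cases".format(project, len(cases)))
+        for case in cases:
+            print("  - {}".format(case.get("name")))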
+
+Test Dashboards
 ===============
+The Test Dashboard is used to provide a consistent view of the results collected in CI.
+The results shown on the dashboard are post-processed from the Test Database, which only
+contains raw results.
+The dashboard can be used in addition to the reporting page (high level view) to allow
+the creation of specific graphs according to what the test owner wants to show.
+
+In Brahmaputra, a basic dashboard was created in Functest.
+In Colorado, Yardstick used Grafana (time-based graphs) and ELK (complex
+graphs).
+Since Danube, the OPNFV testing community has adopted the ELK framework and uses
+Bitergia for creating highly flexible dashboards `[TST5]`_.
+
+.. figure:: ../../images/DashboardBitergia.png
+   :align: center
+   :alt: Testing group dashboard
+
+
+.. include:: ./energy-monitoring.rst
+
+
+OPNFV Test Group Information
+============================
+
+For more information or to participate in the OPNFV test community, please see the
+following:
+
 wiki: https://wiki.opnfv.org/testing
 
 mailing list: test-wg@lists.opnfv.org
 
-IRC chan: #opnfv-testperf
+IRC channel: #opnfv-testperf
 
 weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf):
   * Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST
   * APAC time: 2nd Wednesday of the month 8:00-9:00 UTC
 
+
 =======================
-Reference documentation
+Reference Documentation
 =======================
 
 +----------------+---------------------------------------------------------+
 | Project | Documentation links |
 +================+=========================================================+
-| Bottlenecks | https://wiki.opnfv.org/display/bottlenecks/Bottlenecks |
+| Bottlenecks | https://wiki.opnfv.org/display/bottlenecks/Bottlenecks |
 +----------------+---------------------------------------------------------+
 | CPerf | https://wiki.opnfv.org/display/cperf |
 +----------------+---------------------------------------------------------+
@@ -328,11 +353,30 @@ Reference documentation
 +----------------+---------------------------------------------------------+
 | Functest | https://wiki.opnfv.org/display/functest/ |
 +----------------+---------------------------------------------------------+
-| Qtip | https://wiki.opnfv.org/display/qtip |
+| NFVbench | https://wiki.opnfv.org/display/nfvbench/ |
++----------------+---------------------------------------------------------+
+| QTIP | https://wiki.opnfv.org/display/qtip |
 +----------------+---------------------------------------------------------+
-| Storperf | https://wiki.opnfv.org/display/storperf/Storperf |
+| StorPerf | https://wiki.opnfv.org/display/storperf/Storperf |
 +----------------+---------------------------------------------------------+
-| VSperf | https://wiki.opnfv.org/display/vsperf |
+| VSPERF | https://wiki.opnfv.org/display/vsperf |
 +----------------+---------------------------------------------------------+
 | Yardstick | https://wiki.opnfv.org/display/yardstick/Yardstick |
 +----------------+---------------------------------------------------------+
+
+
+`[TST1]`_: OPNFV web site
+
+`[TST2]`_: Test utils in Releng
+
+`[TST3]`_: TestAPI autogenerated documentation
+
+`[TST4]`_: Testcase catalog
+
+`[TST5]`_: Testing group dashboard
+
+.. _`[TST1]`: http://www.opnfv.org
+.. _`[TST2]`: https://git.opnfv.org/functest/tree/releng/utils/tests
+.. _`[TST3]`: http://artifacts.opnfv.org/releng/docs/testapi.html
+.. _`[TST4]`: http://testresults.opnfv.org/testing/index.html#!/select/visual
+.. _`[TST5]`: https://opnfv.biterg.io:443/goto/283dba93ca18e95964f852c63af1d1ba