.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0
Testing is one of the key activities in OPNFV and includes unit, feature,
component and system level testing for development, automated deployment,
performance characterization and stress testing.
Test projects are dedicated to providing frameworks, tooling and test cases
categorized as functional, performance or compliance testing. Test projects
fulfill different roles, such as verifying VIM functionality, benchmarking
components and platforms, or analysing measured KPIs for OPNFV release
scenarios.
Feature projects also provide their own test suites that either run
independently or within a test project.
This document details the OPNFV testing ecosystem, describes common test
components used by individual OPNFV projects and provides links to project
specific documentation.

===========================
The OPNFV Testing Ecosystem
===========================

The OPNFV testing projects are represented in the following diagram:

.. figure:: ../../images/OPNFV_testing_working_group.png
   :alt: Overview of OPNFV Testing projects

The major testing projects are described in the table below:

+----------------+---------------------------------------------------------+
| Project        | Description                                             |
+================+=========================================================+
| Bottlenecks    | This project aims to find system bottlenecks by         |
|                | testing and verifying OPNFV infrastructure in a         |
|                | staging environment before committing it to a           |
|                | production environment. Instead of debugging a          |
|                | deployment in a production environment, an automated    |
|                | method for executing benchmarks that validates the      |
|                | deployment during staging is adopted. This project      |
|                | forms a staging framework to find bottlenecks and to    |
|                | analyse the OPNFV infrastructure.                       |
+----------------+---------------------------------------------------------+
| CPerf          | SDN Controller benchmarks and performance testing,      |
|                | applicable to controllers in general. A collaboration   |
|                | of upstream controller testing experts, external test   |
|                | tool developers and the standards community. The        |
|                | project primarily contributes to upstream/external      |
|                | tooling and then adds jobs to run those tools on        |
|                | OPNFV's infrastructure.                                 |
+----------------+---------------------------------------------------------+
| Dovetail       | This project intends to define and provide a set of     |
|                | OPNFV related validation criteria/tests that will       |
|                | provide input for the OPNFV Compliance Verification     |
|                | Program. The Dovetail project is executed with the      |
|                | guidance and oversight of the Compliance and            |
|                | Certification (C&C) committee and works to secure the   |
|                | goals of the C&C committee for each release. The        |
|                | project intends to incrementally define qualification   |
|                | criteria that establish the foundations of how one is   |
|                | able to measure the ability to utilize the OPNFV        |
|                | platform, how the platform itself should behave, and    |
|                | how applications may be deployed on the platform.       |
+----------------+---------------------------------------------------------+
| Functest       | This project deals with the functional testing of the   |
|                | VIM and NFVI. It leverages several upstream test        |
|                | suites (OpenStack, ODL, ONOS, etc.) and can be used     |
|                | by feature projects to launch feature test suites in    |
|                | CI/CD. The project is used for scenario validation.     |
+----------------+---------------------------------------------------------+
| NFVbench       | NFVbench is a compact and self-contained data plane     |
|                | performance measurement tool for OpenStack based NFVi   |
|                | platforms. It is agnostic of the NFVi distribution,     |
|                | Neutron networking implementation and hardware. It      |
|                | runs on any Linux server with a DPDK compliant NIC      |
|                | connected to the NFVi platform data plane and bundles   |
|                | a highly efficient software traffic generator. It       |
|                | provides a fully automated measurement of the most      |
|                | common packet paths at any level of scale and load      |
|                | using RFC-2544. It is available as a Docker container   |
|                | with simple command line and REST interfaces, and is    |
|                | easy to use as it takes care of most of the guesswork   |
|                | generally associated with data plane benchmarking. It   |
|                | can run in any lab or production environment.           |
+----------------+---------------------------------------------------------+
| QTIP           | QTIP, the OPNFV project for "Platform Performance       |
|                | Benchmarking", aims to provide users with a simple      |
|                | performance indicator, supported by comprehensive       |
|                | testing data and a transparent calculation formula.     |
|                | It provides a platform with common services for         |
|                | performance benchmarking, which helps users build       |
|                | their own indicators with ease.                         |
+----------------+---------------------------------------------------------+
| StorPerf       | The purpose of this project is to provide a tool to     |
|                | measure block and object storage performance in an      |
|                | NFVI. When complemented with a characterization of      |
|                | typical VF storage performance requirements, it can     |
|                | provide pass/fail thresholds for test, staging, and     |
|                | production NFVI environments.                           |
+----------------+---------------------------------------------------------+
| VSPERF         | VSPERF is an OPNFV project that provides an automated   |
|                | test-framework and comprehensive test suite based on    |
|                | Industry Test Specifications for measuring NFVI         |
|                | data-plane performance. The data-path includes          |
|                | switching technologies with physical and virtual        |
|                | network interfaces. The VSPERF architecture is switch   |
|                | and traffic generator agnostic, and test cases can be   |
|                | easily customized. Software versions and                |
|                | configurations including the vSwitch (OVS or VPP) as    |
|                | well as the network topology are controlled by VSPERF   |
|                | (independent of OpenStack). VSPERF is used as a         |
|                | development tool for optimizing switching               |
|                | technologies, qualification of packet processing        |
|                | components and for pre-deployment evaluation of the     |
|                | NFV platform data-path.                                 |
+----------------+---------------------------------------------------------+
| Yardstick      | The goal of the project is to verify the                |
|                | infrastructure compliance when running VNF              |
|                | applications. NFV Use Cases described in ETSI GS NFV    |
|                | 001 show a large variety of applications, each          |
|                | defining specific requirements and complex              |
|                | configuration on the underlying infrastructure and      |
|                | test tools. The Yardstick concept decomposes typical    |
|                | VNF work-load performance metrics into a number of      |
|                | characteristics/performance vectors, each of which      |
|                | can be represented by distinct test-cases.              |
+----------------+---------------------------------------------------------+

===============================
Testing Working Group Resources
===============================

Test Results Collection Framework
=================================

Any test project running in the global OPNFV lab infrastructure and integrated
with OPNFV CI can push test results to the community Test Database using a
common Test API. This database can be used to track the evolution of testing
and analyse test runs to compare results across installers, scenarios and
between technically and geographically diverse hardware environments.
Results from the database are used to generate a dashboard with the current
test status for each testing project. Please note that you can also deploy the
Test Database and Test API locally in your own environment.

Overall Test Architecture
-------------------------

The management of test results can be summarized as follows::

   +-------------+    +-------------+    +-------------+
   |    Test     |    |    Test     |    |    Test     |
   |  Project #1 |    |  Project #2 |    |  Project #N |
   +-------------+    +-------------+    +-------------+
          |                  |                  |
          +------------------+------------------+
                             |
   +---------------------------------------------+
   |                                             |
   |           Test Rest API front end           |
   |      http://testresults.opnfv.org/test      |
   |                                             |
   +---------------------------------------------+
                             |
   +---------------------------------------------+
   |                                             |
   |          +-------------------------+        |
   |          |     Test Results DB     |        |
   |          +-------------------------+        |
   |                                             |
   +---------------------------------------------+
            |                         |
   +----------------------+   +----------------------+
   |  Testing Dashboards  |   |  Test Landing page   |
   +----------------------+   +----------------------+

A MongoDB database was introduced for the Brahmaputra release.
The following collections are declared in this database:

* pods: the list of pods used for production CI
* projects: the list of projects providing test cases
* test cases: the test cases related to a given project
* results: the results of the test cases
* scenarios: the OPNFV scenarios tested in CI

This database can be used by any project through the Test API.
Please note that projects may also use additional databases. The Test
Database is mainly used to collect CI test results and generate scenario
trust indicators. The Test Database is also cloned for OPNFV Plugfests in
order to provide a private datastore only accessible to Plugfest participants.
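
As an illustration, results in the database can be queried over HTTP through
the Test API. The sketch below only builds a query URL; the ``/results``
endpoint and its query parameter names are assumptions based on the data model
described here, and the project, case and pod names are examples only (check
the Test API documentation for the exact interface):

```python
import urllib.parse

# Base URL of the community Test API (from this document).
TEST_API = "http://testresults.opnfv.org/test/api/v1"


def build_results_url(project, case, pod=None):
    """Build a query URL for CI results.

    The endpoint name and the query parameter names (project, case,
    pod) are assumptions based on the data model described above.
    """
    params = {"project": project, "case": case}
    if pod is not None:
        params["pod"] = pod
    return TEST_API + "/results?" + urllib.parse.urlencode(params)


# Example query for a hypothetical project/case combination.
print(build_results_url("functest", "vping_ssh"))
```

The returned URL can then be fetched with any HTTP client to retrieve the raw
JSON results for post-processing.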

Test API description
--------------------

The Test API is used to declare pods, projects, test cases and test results.
Pods correspond to a cluster of machines (3 controller and 2 compute nodes in
HA mode) used to run the tests and are defined in the Pharos project.
The results pushed in the database are related to pods, projects and test cases.
Trying to push results generated from a non-referenced pod will return an error
message from the Test API.
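
To make this concrete, here is a minimal, hedged sketch of assembling and
pushing a result document. The field names (``pod_name``, ``project_name``,
``case_name``, ``details``) and the ``/results`` path are assumptions based on
the objects described in this section, not a verbatim copy of the Test API
schema, and the pod/project/case values are illustrative:

```python
import json
import urllib.request

# Results endpoint of the community Test API; the exact path is an
# assumption -- see the Test API documentation for the real interface.
RESULTS_URL = "http://testresults.opnfv.org/test/api/v1/results"


def build_result(pod_name, project_name, case_name, details):
    """Assemble a result document referencing a pod, a project and a
    test case, as the Test API requires."""
    return {
        "pod_name": pod_name,
        "project_name": project_name,
        "case_name": case_name,
        "details": details,
    }


def push_result(result):
    """POST a result document; results from a pod that is not
    referenced in the database are rejected with an error."""
    req = urllib.request.Request(
        RESULTS_URL,
        data=json.dumps(result).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Illustrative values only; push_result() is not called here because it
# requires network access to the community database.
result = build_result("intel-pod1", "functest", "vping_ssh",
                      {"duration": 42, "status": "PASS"})
print(json.dumps(result, indent=2))
```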
The data model is very basic; 5 objects are available:

* pods
* projects
* test cases
* results
* scenarios

For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html
The code of the Test API is hosted in the releng-testresults repository `[TST2]`_.
The static documentation of the Test API can be found at `[TST3]`_.
The Test API has been dockerized and may be installed locally in your lab.
The deployment of the Test API has been automated.
A Jenkins job manages:

* the unit tests of the Test API
* the creation of a new Docker file
* the deployment of the new Test API
* the archiving of the old Test API
* the backup of the Mongo DB

Test API Authorization
----------------------

PUT/DELETE/POST operations of the TestAPI now require token-based authorization.
The token needs to be added to the request using the header 'X-Auth-Token' for
access to the database.
.. code-block:: python

    headers['X-Auth-Token']
The value of the header, i.e. the token, can be accessed in the Jenkins
environment variable *TestApiToken*. The token value is added as a masked
password.
.. code-block:: python

    headers['X-Auth-Token'] = os.environ.get('TestApiToken')
The above example is in Python. Token-based authentication has been added so
that only CI pods running Jenkins jobs can access the database. Please note
that token authorization is currently implemented but not yet enabled.
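
Putting this together, a CI job could build its request headers as in the
sketch below. Only the 'X-Auth-Token' header and the *TestApiToken* variable
come from the text above; the helper function itself is illustrative:

```python
import os


def auth_headers(token=None):
    """Build request headers for a PUT/DELETE/POST call to the TestAPI.

    The token is normally read from the masked Jenkins environment
    variable TestApiToken; pass one explicitly for local testing.
    """
    headers = {"Content-Type": "application/json"}
    token = token or os.environ.get("TestApiToken")
    if token:
        # Without this header, write operations are rejected once
        # token authorization is enabled.
        headers["X-Auth-Token"] = token
    return headers


print(auth_headers("dummy-token"))
```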

Test Project Reporting
======================

The reporting page for the test projects is http://testresults.opnfv.org/reporting/

.. figure:: ../../images/reporting_page.png
   :alt: Testing group reporting page

This page provides reporting per OPNFV release and per testing project.

.. figure:: ../../images/reportingMaster.png
   :alt: Testing group Euphrates reporting page

An evolution of the reporting page is planned to unify test reporting by
creating a landing page that shows the scenario status at a glance (this
information was previously consolidated manually on a wiki page). The landing
page will be displayed per scenario and will show:

* the status of the deployment
* the score from each test suite; there is no overall score, it is determined
  by each test project.

Test Case Catalog
=================

Until the Colorado release, each testing project managed its own list of test
cases, which made it very hard to get a global view of the available test
cases across the different test projects. A common view was possible through
the API, but it was not very user friendly.
Test cases per project may be listed by calling::

  http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases

with project_name: bottlenecks, functest, qtip, storperf, vsperf, yardstick
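
This call can easily be scripted; the helper below is illustrative, and only
the endpoint and the project names come from this section:

```python
# Base URL of the community Test API (from this document).
TEST_API = "http://testresults.opnfv.org/test/api/v1"

# Projects exposing test cases through the Test API (listed above).
PROJECTS = ("bottlenecks", "functest", "qtip", "storperf", "vsperf",
            "yardstick")


def cases_url(project_name):
    """Return the Test API URL listing the test cases of a project."""
    return "{}/projects/{}/cases".format(TEST_API, project_name)


for name in PROJECTS:
    print(cases_url(name))
```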

A test case catalog has now been realized `[TST4]`_. Roll over a project, then
click to get its list of test cases, and then click on a case to get more
details.

.. figure:: ../../images/TestcaseCatalog.png
   :alt: Testing group testcase catalog

Test Dashboard
==============

The Test Dashboard is used to provide a consistent view of the results
collected in CI. The results shown on the dashboard are post-processed from
the database, which only contains raw results. The dashboard can be used in
addition to the reporting page (high level view) to allow the creation of
specific graphs according to what the test owner wants to show.

In Brahmaputra, a basic dashboard was created in Functest.
In Colorado, Yardstick used Grafana (time based graphs) and ELK (complex
queries).
Since Danube, the OPNFV testing community has decided to adopt the ELK
framework and to use Bitergia for creating highly flexible dashboards `[TST5]`_.

.. figure:: ../../images/DashboardBitergia.png
   :alt: Testing group dashboard

.. include:: ./energy-monitoring.rst


OPNFV Test Group Information
============================

For more information or to participate in the OPNFV test community please see
the following:

wiki: https://wiki.opnfv.org/testing

mailing list: test-wg@lists.opnfv.org

IRC channel: #opnfv-testperf

weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf):

* Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST
* APAC time: 2nd Wednesday of the month 8:00-9:00 UTC

=======================
Reference Documentation
=======================

+----------------+---------------------------------------------------------+
| Project        | Documentation links                                     |
+================+=========================================================+
| Bottlenecks    | https://wiki.opnfv.org/display/bottlenecks/Bottlenecks  |
+----------------+---------------------------------------------------------+
| CPerf          | https://wiki.opnfv.org/display/cperf                    |
+----------------+---------------------------------------------------------+
| Dovetail       | https://wiki.opnfv.org/display/dovetail                 |
+----------------+---------------------------------------------------------+
| Functest       | https://wiki.opnfv.org/display/functest/                |
+----------------+---------------------------------------------------------+
| NFVbench       | https://wiki.opnfv.org/display/nfvbench/                |
+----------------+---------------------------------------------------------+
| QTIP           | https://wiki.opnfv.org/display/qtip                     |
+----------------+---------------------------------------------------------+
| StorPerf       | https://wiki.opnfv.org/display/storperf/Storperf        |
+----------------+---------------------------------------------------------+
| VSPERF         | https://wiki.opnfv.org/display/vsperf                   |
+----------------+---------------------------------------------------------+
| Yardstick      | https://wiki.opnfv.org/display/yardstick/Yardstick      |
+----------------+---------------------------------------------------------+

`[TST1]`_: OPNFV web site

`[TST2]`_: TestAPI code repository link in releng-testresults

`[TST3]`_: TestAPI autogenerated documentation

`[TST4]`_: Testcase catalog

`[TST5]`_: Testing group dashboard

.. _`[TST1]`: http://www.opnfv.org
.. _`[TST2]`: https://git.opnfv.org/releng-testresults
.. _`[TST3]`: http://artifacts.opnfv.org/releng/docs/testapi.html
.. _`[TST4]`: http://testresults.opnfv.org/testing/index.html#!/select/visual
.. _`[TST5]`: https://opnfv.biterg.io:443/goto/283dba93ca18e95964f852c63af1d1ba