.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0

***********************
Testing developer guide
***********************

The OPNFV testing ecosystem is wide.

This guide provides guidelines for new developers involved in test areas.

For a description of the ecosystem, see `[DEV1]`_.

There are several ways to join test projects as a developer. You may:

* Develop new test cases
* Develop tooling (reporting, dashboards, graphs, middleware, ...)
* Troubleshoot results
* Post-process results

These different tasks may be done within a specific project or as a shared
resource across the different projects.

If you develop new test cases, the best practice is to contribute upstream as
much as possible. You may contact the testing group to find out which project,
in OPNFV or upstream, would be the best place to host the test cases. Such
contributions are usually directly connected to a specific project; more
details can be found in the user guides of the testing projects.

Each OPNFV testing project provides test cases and the framework to manage
them. As a developer, you can obviously contribute to them. The developer
guide of each testing project indicates the procedure to follow.

Tooling may be specific to a project or generic to all the projects. For
project-specific tooling, please refer to the test project's user guide. The
tooling shared by several test projects is detailed in this document.

The best event at which to meet the testing community is probably the
plugfest. Such an event is organized after each release, and most of the test
projects are present.

The summit is also a good opportunity to meet most of the actors `[DEV4]`_.


Be involved in the testing group
================================

The testing group is a self-organized working group. The OPNFV projects
dealing with testing are invited to participate in order to elaborate and
consolidate a consistent test strategy (test case definition, scope of
projects, resources for long-duration testing, documentation, ...) and to
align tooling and best practices.

A weekly meeting is organized, and the agenda may be amended by any
participant. Two slots have been defined (US/Europe and APAC). Agendas and
minutes are public; see `[DEV3]`_ for details.
The testing group IRC channel is #opnfv-testperf.

Not all test projects have the same maturity or number of contributors, and
the nature of the test projects may also differ. The following best practices
may therefore not be accurate for all the projects and are only indicative.
Contact the testing group for further details.

Most of the projects have a similar structure, which can be defined as
follows::

  | `-- Dockerfile.aarch64.patch

Test projects install tools and trigger tests. When possible, it is
recommended to implement an API to perform the different actions.

Each test project should be able to expose and consume APIs from other test
projects. This pseudo micro-service approach should allow a flexible use of
the different projects and reduce the risk of overlap. For instance, if
project A provides an API to deploy a traffic generator, it is better to reuse
it rather than implementing a new way to deploy it. This approach has not been
implemented yet, but the prerequisite consisting in exposing an API has
already been met by several test projects.

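
As a minimal sketch of this approach (the route and payload below are
hypothetical, not an existing OPNFV API), a test project could expose a small
health endpoint that other projects query before reusing its services:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class TestProjectAPI(BaseHTTPRequestHandler):
    """Tiny API a test project could expose (routes are hypothetical)."""

    def do_GET(self):
        if self.path == "/api/v1/health":
            # Another project can poll this before reusing our services.
            body = json.dumps({"status": "ready"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence the default per-request logging.
        pass

def serve():
    """Start the API on an ephemeral port in a daemon thread."""
    server = HTTPServer(("127.0.0.1", 0), TestProjectAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A consuming project would simply issue an HTTP GET against the exposed route
instead of re-implementing the deployment logic itself.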
Most of the test projects provide a Docker container as a deliverable. Once
connected, it is possible to prepare the environment and run tests through a
CLI.

Dockerization was introduced in Brahmaputra and has been adopted by most of
the test projects. Docker containers are pulled onto the jumphost of the OPNFV
POD.

It is recommended to control the quality of the code of the testing projects,
and more precisely to implement some verifications before any merge:

* unit tests (python 2.7)
* unit tests (python 3.5)

The code of the test project must be covered by unit tests. The coverage
shall be reasonable and shall not decrease when new features are added to the
framework. The use of tox is recommended.
It is possible to implement strict rules (no decrease of the pylint score, no
decrease of unit test coverage) on critical python classes.

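
As an illustration (the helper and its module are hypothetical), a unit test
covering a small framework helper could look like the following; such tests
would typically be wired into tox environments for python 2.7 and 3.5:

```python
import unittest

def get_tier(case):
    """Hypothetical framework helper: return the tier of a test case record,
    defaulting to the lightest category."""
    return case.get("tier", "healthcheck")

class GetTierTest(unittest.TestCase):
    """Unit tests keeping the helper covered before any merge."""

    def test_default_tier(self):
        self.assertEqual(get_tier({}), "healthcheck")

    def test_explicit_tier(self):
        self.assertEqual(get_tier({"tier": "smoke"}), "smoke")
```

Coverage tooling run through tox can then flag any merge request that lowers
the measured coverage of such modules.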
Several test projects integrate third-party tooling for code quality checks
and/or traffic generation. Some of these tools are listed below:

+---------------+----------------------+------------------------------------+
| Project       | Tool                 | Comments                           |
+===============+======================+====================================+
| Bottlenecks   | TODO                 |                                    |
+---------------+----------------------+------------------------------------+
| Functest      | Tempest              | OpenStack test tooling             |
|               | Rally                | OpenStack test tooling             |
|               | Refstack             | OpenStack test tooling             |
|               | RobotFramework       | Used for ODL tests                 |
+---------------+----------------------+------------------------------------+
| QTIP          | Unixbench            |                                    |
+---------------+----------------------+------------------------------------+
| Storperf      | TODO                 |                                    |
+---------------+----------------------+------------------------------------+
| Yardstick     | Moongen              | Traffic generator                  |
|               | Trex                 | Traffic generator                  |
|               | Pktgen               | Traffic generator                  |
|               | IxLoad, IxNet        | Traffic generator                  |
|               | Unixbench            | Compute                            |
|               | RAMSpeed             | Compute                            |
|               | LMBench              | Compute                            |
|               | Iperf3               | Network                            |
|               | Netperf              | Network                            |
|               | Pktgen-DPDK          | Network                            |
|               | Testpmd              | Network                            |
|               | L2fwd                | Network                            |
|               | Bonnie++             | Storage                            |
+---------------+----------------------+------------------------------------+


======================================
Testing group configuration parameters
======================================

The testing group defined several categories, also known as tiers. These
categories can be used to group test suites.

+----------------+-------------------------------------------------------------+
| Category       | Description                                                 |
+================+=============================================================+
| Healthcheck    | Simple and quick healthcheck test cases                     |
+----------------+-------------------------------------------------------------+
| Smoke          | Set of smoke test cases/suites to validate the release      |
+----------------+-------------------------------------------------------------+
| Features       | Test cases that validate a specific feature on top of       |
|                | OPNFV. Those come from feature projects and need a bit of   |
|                | support for integration                                     |
+----------------+-------------------------------------------------------------+
| Components     | Tests on a specific component (e.g. OpenStack, OVS, DPDK,..)|
|                | They may extend smoke tests                                 |
+----------------+-------------------------------------------------------------+
| Performance    | Performance qualification                                   |
+----------------+-------------------------------------------------------------+
| VNF            | Test cases related to the deployment of an open source VNF, |
|                | including an orchestrator                                   |
+----------------+-------------------------------------------------------------+
| Stress         | Stress and robustness tests                                 |
+----------------+-------------------------------------------------------------+
| In Service     | In-service testing                                          |
+----------------+-------------------------------------------------------------+

The domains deal with the technical scope of the tests. They shall correspond
to the domains defined for the certification program:

One of the goals of the testing working group is to identify poorly covered
areas and to avoid testing overlap.
Ideally, based on the declaration of the test cases through the tags, domains
and tier fields, it shall be possible to create heuristic coverage maps.

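
A sketch of how such a map could be derived from test case declarations (the
case records and their field names here are hypothetical):

```python
from collections import Counter

# Hypothetical test case declarations, each tagged with a tier and a domain.
cases = [
    {"name": "healthcheck", "tier": "healthcheck", "domain": "compute"},
    {"name": "vping", "tier": "smoke", "domain": "network"},
    {"name": "cinder_test", "tier": "smoke", "domain": "storage"},
]

def coverage_by_domain(cases):
    """Count declared test cases per domain to spot poorly covered areas."""
    return Counter(case["domain"] for case in cases)
```

A domain that appears with a low count (or not at all) in the resulting map is
a candidate for new test development.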

Where can I find information on the different test projects?
============================================================

On http://docs.opnfv.org! A section is dedicated to the testing projects. You
will find the overview of the ecosystem and the links to the project documents.

Another source is the testing wiki: https://wiki.opnfv.org/display/testing

You may also contact the testing group on the IRC channel #opnfv-testperf or
by mail at test-wg AT lists.opnfv.org (testing group) or opnfv-tech-discuss AT
lists.opnfv.org (generic technical discussions).


How can I contribute to a test project?
=======================================

As for any project, the best solution is to contact the project directly. The
project members and their email addresses can be found under
https://git.opnfv.org/<project>/tree/INFO

You may also send a mail to the testing mailing list or use the IRC channel
#opnfv-testperf.


Where can I find hardware resources?
====================================

You should discuss this topic with the project you are working with. If you
need access to an OPNFV community POD, it is possible to contact the
infrastructure group. Depending on your needs (scenario/installer/tooling), it
should be possible to find free time slots on an OPNFV community POD from the
Pharos federation. Create a JIRA ticket describing your needs on
https://jira.opnfv.org/projects/INFRA.
You must already be an OPNFV contributor. See
https://wiki.opnfv.org/display/DEV/Developer+Getting+Started.

Please note that many projects have their own "how to contribute" or
"get started" page on the OPNFV wiki.


How do I integrate my tests in CI?
==================================

This shall be discussed directly with the project you are working with.
Integration is done through Jenkins jobs calling the testing project files,
but the way to onboard test cases differs from one project to another.


How do I declare my tests in the test database?
===============================================

If you have access to the test API swagger (access granted to contributors),
you may use the swagger interface of the test API to declare your project.
The URL is http://testresults.opnfv.org/test/swagger/spec.html.


.. figure:: ../../../images/swaggerUI.png
   :alt: Testing Group Test API swagger

Click on *Spec*; the list of available methods is then displayed.

.. figure:: ../../../images/API-operations.png
   :alt: Testing Group Test API swagger

For the declaration of a new project, use the POST /api/v1/projects method.
For the declaration of new test cases in an existing project, use the POST
/api/v1/projects/{project_name}/cases method.


.. figure:: ../../../images/CreateCase.png
   :alt: Testing group declare new test case

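
Such a declaration can also be scripted. The sketch below only builds the
request without sending it; the payload field names are indicative, so check
the swagger spec for the authoritative schema:

```python
import json
import urllib.request

TESTAPI_URL = "http://testresults.opnfv.org/test/api/v1"

def build_case_request(project, case_name, description):
    """Build (but do not send) the POST request declaring a new test case
    under an existing project. Field names are indicative."""
    url = "%s/projects/%s/cases" % (TESTAPI_URL, project)
    body = json.dumps({"name": case_name,
                       "description": description}).encode()
    return urllib.request.Request(url, data=body,
                                  headers={"Content-Type": "application/json"})

# The request would then be sent with:
# urllib.request.urlopen(build_case_request("myproject", "mycase", "demo"))
```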

How do I push my results into the test database?
================================================

The test database is used to collect test results. By default it is enabled
only for CI tests from production CI pods.

Please note that it is possible to create your own local database. A dedicated
database is, for instance, created for each plugfest.

The architecture and associated API are described in the previous chapter.
If you want to push your results from CI, you just have to call the API
at the end of your script.

You can also reuse the python function defined in functest_utils.py `[DEV2]`_.

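
A minimal sketch of such a call (the payload fields mirror common result
records: project, case, pod, version, criteria; treat them as indicative and
align them with the test API schema):

```python
import json
import urllib.request

def build_result_request(db_url, project, case, pod, version, criteria,
                         details):
    """Build the POST request pushing one test result to the database.
    Field names are indicative, not the authoritative schema."""
    payload = {"project_name": project, "case_name": case,
               "pod_name": pod, "version": version,
               "criteria": criteria, "details": details}
    return urllib.request.Request(
        "%s/results" % db_url, data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})

# At the end of a CI script, the request would be sent with:
# urllib.request.urlopen(build_result_request(...))
```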

Where can I find the documentation on the test API?
===================================================

The test API is now documented in this document (see the sections above).
You may also find autogenerated documentation at
http://artifacts.opnfv.org/releng/docs/testapi.html
A web portal is also under construction for certification at
http://testresults.opnfv.org/test/#/


I have tests, to which category should I declare them?
======================================================

The main ambiguity is usually between the Features and VNF categories.
In fact, sometimes you have to spawn VMs to demonstrate the capabilities of
the feature you introduced. In that case, we recommend declaring your test in
the Features category.

The VNF category is really dedicated to tests including:

* the creation of resources
* the deployment of an orchestrator/VNFM
* the deployment of the VNF

The goal is not to study a particular feature of the infrastructure but to
have a whole end-to-end test of a VNF automatically deployed in CI.
Moreover, VNF tests are run in weekly jobs (once a week), whereas feature
tests are run in daily jobs and are used to compute a scenario score.


Where are the logs of CI runs?
==============================

Logs and configuration files can be pushed to the artifact server from the CI
under http://artifacts.opnfv.org/<project name>.

`[DEV1]`_: OPNFV Testing Ecosystem

`[DEV2]`_: Python code sample to push results into the database

`[DEV3]`_: Testing group wiki page

`[DEV4]`_: Conversation with the testing community, OPNFV Beijing Summit

.. _`[DEV1]`: http://docs.opnfv.org/en/latest/testing/ecosystem/index.html
.. _`[DEV2]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176
.. _`[DEV3]`: https://wiki.opnfv.org/display/meetings/Test+Working+Group+Weekly+Meeting
.. _`[DEV4]`: https://www.youtube.com/watch?v=f9VAUdEqHoA

IRC support channel: #opnfv-testperf