arm_compute v17.12
diff --git a/documentation/tests.xhtml b/documentation/tests.xhtml
index 19934e4..6367c0f 100644
--- a/documentation/tests.xhtml
+++ b/documentation/tests.xhtml
@@ -4,7 +4,7 @@
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
-<meta name="generator" content="Doxygen 1.8.6"/>
+<meta name="generator" content="Doxygen 1.8.11"/>
<meta name="robots" content="NOINDEX, NOFOLLOW" /> <!-- Prevent indexing by search engines -->
<title>Compute Library: Validation and benchmarks tests</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
@@ -12,22 +12,24 @@
<script type="text/javascript" src="dynsections.js"></script>
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="resize.js"></script>
+<script type="text/javascript" src="navtreedata.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
$(window).load(resizeHeight);
</script>
<link href="search/search.css" rel="stylesheet" type="text/css"/>
+<script type="text/javascript" src="search/searchdata.js"></script>
<script type="text/javascript" src="search/search.js"></script>
<script type="text/javascript">
- $(document).ready(function() { searchBox.OnSelectItem(0); });
+ $(document).ready(function() { init_search(); });
</script>
<script type="text/x-mathjax-config">
MathJax.Hub.Config({
extensions: ["tex2jax.js"],
jax: ["input/TeX","output/HTML-CSS"],
});
-</script><script src="http://cdn.mathjax.org/mathjax/latest/MathJax.js"></script>
+</script><script type="text/javascript" src="http://cdn.mathjax.org/mathjax/latest/MathJax.js"></script>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
</head>
<body>
@@ -38,7 +40,7 @@
<tr style="height: 56px;">
<td style="padding-left: 0.5em;">
<div id="projectname">Compute Library
-  <span id="projectnumber">17.10</span>
+  <span id="projectnumber">17.12</span>
</div>
</td>
</tr>
@@ -46,7 +48,7 @@
</table>
</div>
<!-- end header part -->
-<!-- Generated by Doxygen 1.8.6 -->
+<!-- Generated by Doxygen 1.8.11 -->
<script type="text/javascript">
var searchBox = new SearchBox("searchBox", "search",false,'Search');
</script>
@@ -95,7 +97,7 @@
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
onkeydown="return searchBox.OnSearchSelectKey(event)">
-<a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(0)"><span class="SelectionMark"> </span>All</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(1)"><span class="SelectionMark"> </span>Data Structures</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(2)"><span class="SelectionMark"> </span>Namespaces</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(3)"><span class="SelectionMark"> </span>Files</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(4)"><span class="SelectionMark"> </span>Functions</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(5)"><span class="SelectionMark"> </span>Variables</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(6)"><span class="SelectionMark"> </span>Typedefs</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(7)"><span class="SelectionMark"> </span>Enumerations</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(8)"><span class="SelectionMark"> </span>Enumerator</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(9)"><span class="SelectionMark"> </span>Friends</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(10)"><span class="SelectionMark"> </span>Macros</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(11)"><span class="SelectionMark"> </span>Pages</a></div>
+</div>
<!-- iframe showing the search results (closed by default) -->
<div id="MSearchResultsWindow">
@@ -110,7 +112,8 @@
</div><!--header-->
<div class="contents">
<div class="toc"><h3>Table of Contents</h3>
-<ul><li class="level1"><a href="#tests_overview">Overview</a><ul><li class="level2"><a href="#tests_overview_fixtures">Fixtures</a><ul><li class="level3"><a href="#tests_overview_fixtures_fixture">Fixture</a></li>
+<ul><li class="level1"><a href="#tests_overview">Overview</a><ul><li class="level2"><a href="#tests_overview_structure">Directory structure</a></li>
+<li class="level2"><a href="#tests_overview_fixtures">Fixtures</a><ul><li class="level3"><a href="#tests_overview_fixtures_fixture">Fixture</a></li>
<li class="level3"><a href="#tests_overview_fixtures_data_fixture">Data fixture</a></li>
</ul>
</li>
@@ -132,7 +135,6 @@
<li class="level3"><a href="#tests_running_tests_benchmarking_instruments">Instruments</a></li>
</ul>
</li>
-<li class="level2"><a href="#tests_running_tests_validation">Validation</a></li>
</ul>
</li>
</ul>
@@ -140,14 +142,34 @@
<div class="textblock"><h1><a class="anchor" id="tests_overview"></a>
Overview</h1>
<p>Benchmark and validation tests are based on the same framework to set up and run the tests. In addition to running simple, self-contained test functions, the framework supports fixtures and data test cases. The former allow common setup routines to be shared between various backends, thus reducing the amount of duplicated code. The latter can be used to parameterize tests or fixtures with different inputs, e.g. different tensor shapes. One limitation is that tests/fixtures cannot be parameterized based on the data type if static type information is needed within the test (e.g. to validate the results).</p>
-<h2><a class="anchor" id="tests_overview_fixtures"></a>
+<h2><a class="anchor" id="tests_overview_structure"></a>
+Directory structure</h2>
+<pre class="fragment">.
+`-- tests <- Top level test directory. All files in here are shared among validation and benchmark.
+ |-- framework <- Underlying test framework.
+ |-- CL \
+ |-- NEON -> Backend specific files with helper functions etc.
+ |-- benchmark <- Top level directory for the benchmarking files.
+ | |-- fixtures <- Fixtures for benchmark tests.
+ | |-- CL <- OpenCL backend test cases on a function level.
+ | | `-- SYSTEM <- OpenCL system tests, e.g. whole networks
+ | `-- NEON <- Same for NEON
+ | `-- SYSTEM
+ |-- datasets <- Datasets for benchmark and validation tests.
+ |-- main.cpp <- Main entry point for the tests. Currently shared between validation and benchmarking.
+ |-- networks <- Network classes for system level tests.
+ `-- validation <- Top level directory for validation files.
+ |-- CPP <- C++ reference code
+ |-- CL \
+ |-- NEON -> Backend specific test cases
+ `-- fixtures <- Fixtures shared among all backends. Used to set up target function and tensors.
+</pre><h2><a class="anchor" id="tests_overview_fixtures"></a>
Fixtures</h2>
<p>Fixtures can be used to share common setup, teardown or even run tasks among multiple test cases. For that purpose a fixture can define a <code>setup</code>, <code>teardown</code> and <code>run</code> method. Additionally the constructor and destructor might also be customized.</p>
<p>An instance of the fixture is created immediately before the actual test is executed. After construction the <a class="el" href="classarm__compute_1_1test_1_1framework_1_1_fixture.xhtml#a4fc01d736fe50cf5b977f755b675f11d">framework::Fixture::setup</a> method is called. Then the test function or the fixtures <code>run</code> method is invoked. After test execution the <a class="el" href="classarm__compute_1_1test_1_1framework_1_1_fixture.xhtml#a4adab6322a0276f34a7d656d49fc865c">framework::Fixture::teardown</a> method is called and lastly the fixture is destructed.</p>
<h3><a class="anchor" id="tests_overview_fixtures_fixture"></a>
Fixture</h3>
-<p>Fixtures for non-parameterized test are straightforward. The custom fixture class has to inherit from <a class="el" href="classarm__compute_1_1test_1_1framework_1_1_fixture.xhtml">framework::Fixture</a> and choose to implement any of the <code>setup</code>, <code>teardown</code> or <code>run</code> methods. None of the methods takes any arguments or returns anything. </p>
-<pre class="fragment">class CustomFixture : public framework::Fixture
+<p>Fixtures for non-parameterized tests are straightforward. The custom fixture class has to inherit from <a class="el" href="classarm__compute_1_1test_1_1framework_1_1_fixture.xhtml">framework::Fixture</a> and can implement any of the <code>setup</code>, <code>teardown</code> or <code>run</code> methods. None of these methods takes arguments or returns anything. </p><pre class="fragment">class CustomFixture : public framework::Fixture
{
void setup()
{
@@ -168,8 +190,7 @@
};
</pre><h3><a class="anchor" id="tests_overview_fixtures_data_fixture"></a>
Data fixture</h3>
-<p>The advantage of a parameterized fixture is that arguments can be passed to the setup method at runtime. To make this possible the setup method has to be a template with a type parameter for every argument (though the template parameter doesn't have to be used). All other methods remain the same. </p>
-<pre class="fragment">class CustomFixture : public framework::Fixture
+<p>The advantage of a parameterized fixture is that arguments can be passed to the setup method at runtime. To make this possible, the setup method has to be a template with a type parameter for every argument (though the template parameter doesn't have to be used). All other methods remain the same. </p><pre class="fragment">class CustomFixture : public framework::Fixture
{
#ifdef ALTERNATIVE_DECLARATION
template <typename ...>
@@ -342,15 +363,11 @@
Benchmarking</h2>
<h3><a class="anchor" id="tests_running_tests_benchmarking_filter"></a>
Filter tests</h3>
-<p>All tests can be run by invoking </p>
-<pre class="fragment">./arm_compute_benchmark ./data
+<p>All tests can be run by invoking </p><pre class="fragment">./arm_compute_benchmark ./data
</pre><p>where <code>./data</code> contains the assets needed by the tests.</p>
-<p>If only a subset of the tests has to be executed the <code>--filter</code> option takes a regular expression to select matching tests. </p>
-<pre class="fragment">./arm_compute_benchmark --filter='NEON/.*AlexNet' ./data
-</pre><p>Additionally each test has a test id which can be used as a filter, too. However, the test id is not guaranteed to be stable when new tests are added. Only for a specific build the same the test will keep its id. </p>
-<pre class="fragment">./arm_compute_benchmark --filter-id=10 ./data
-</pre><p>All available tests can be displayed with the <code>--list-tests</code> switch. </p>
-<pre class="fragment">./arm_compute_benchmark --list-tests
+<p>If only a subset of the tests has to be executed, the <code>--filter</code> option takes a regular expression to select matching tests. </p><pre class="fragment">./arm_compute_benchmark --filter='NEON/.*AlexNet' ./data
+</pre><p>Additionally, each test has a test id which can also be used as a filter. However, the test id is not guaranteed to be stable when new tests are added; only within a specific build will a test keep its id. </p><pre class="fragment">./arm_compute_benchmark --filter-id=10 ./data
+</pre><p>All available tests can be displayed with the <code>--list-tests</code> switch. </p><pre class="fragment">./arm_compute_benchmark --list-tests
</pre><p>More options can be found in the <code>--help</code> message.</p>
<h3><a class="anchor" id="tests_running_tests_benchmarking_runtime"></a>
Runtime</h3>
@@ -368,18 +385,15 @@
<p><code>MALI</code> will try to collect Mali hardware performance counters. (You need to have a recent enough Mali driver)</p>
<p><code>WALL_CLOCK</code> will measure time using <code>gettimeofday</code>: this should work on all platforms.</p>
<p>You can pass a combination of these instruments: <code>--instruments=PMU,MALI,WALL_CLOCK</code></p>
-<dl class="section note"><dt>Note</dt><dd>You need to make sure the instruments have been selected at compile time using the <code>pmu=1</code> or <code>mali=1</code> scons options.</dd></dl>
-<h2><a class="anchor" id="tests_running_tests_validation"></a>
-Validation</h2>
-<dl class="section note"><dt>Note</dt><dd>The new validation tests have the same interface as the benchmarking tests. </dd></dl>
+<dl class="section note"><dt>Note</dt><dd>You need to make sure the instruments have been selected at compile time using the <code>pmu=1</code> or <code>mali=1</code> scons options. </dd></dl>
</div></div><!-- contents -->
</div><!-- doc-content -->
<!-- start footer part -->
<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
<ul>
- <li class="footer">Generated on Thu Oct 12 2017 14:26:35 for Compute Library by
+ <li class="footer">Generated on Thu Dec 14 2017 23:48:34 for Compute Library by
<a href="http://www.doxygen.org/index.html">
- <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.6 </li>
+ <img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.11 </li>
</ul>
</div>
</body>