Merge "Move performance processing to a util class."
diff --git a/build-python.sh b/build-python.sh
index bf35300..2a811fc 100755
--- a/build-python.sh
+++ b/build-python.sh
@@ -30,6 +30,7 @@
sed -i 's/import "ComponentSpecificationMessage.proto";/import "test\/vts\/proto\/ComponentSpecificationMessage.proto";/g' proto/VtsProfilingMessage.proto
protoc -I=proto --python_out=proto proto/ComponentSpecificationMessage.proto
+protoc -I=proto --python_out=proto proto/TestSchedulingPolicyMessage.proto
protoc -I=proto --python_out=proto proto/VtsReportMessage.proto
protoc -I=proto --python_out=proto proto/VtsWebStatusMessage.proto
diff --git a/doc/testcase_develop_manual/codelab_enable_profiling.md b/doc/testcase_develop_manual/codelab_enable_profiling.md
new file mode 100644
index 0000000..464cdc0
--- /dev/null
+++ b/doc/testcase_develop_manual/codelab_enable_profiling.md
@@ -0,0 +1,122 @@
+# Codelab - Enable Profiling for VTS HIDL HAL Tests
+
+By enabling profiling for your VTS HIDL HAL test, you can expect to get:
+
+ * __trace files__ that record each HAL API call made during the test
+   execution, together with the passed argument values and the return values.
+
+ * __performance profiling data__ for each API call, which is also displayed on
+   the VTS Dashboard if the dashboard feature is used.
+
+## 1. Add profiler library to VTS
+
+To enable profiling for your HAL tests, you need to add the corresponding
+profiler library in two places:
+
+* [vts_test_lib_hidl_package_list.mk](../../tools/build/tasks/list/vts_test_lib_hidl_package_list.mk).
+
+* [HidlHalTest.push](../../tools/vts-tradefed/res/push_groups/HidlHalTest.push).
+
+The name of the profiling library follows the pattern
+`package_name@version-interface-vts-profiler.so`.
+For example, the library name for the NFC HAL is `android.hardware.nfc@1.0-INfc-vts.profiler.so`.
+
+## 2. Modify Your VTS Test Case
+
+If you have not done so already, the
+[Codelab for Host-Driven Tests](codelab_host_driven_test.md)
+gives an overview of how to write a VTS test case. This section assumes you have
+completed that codelab and have at least one VTS test case (either host-driven or
+target-side) for which you would like to enable profiling.
+
+### 2.1. Target-Side Tests
+
+This subsection describes how to enable profiling for target-side tests.
+
+* Copy an existing test directory
+
+ `$ cd hardware/interfaces/<HAL Name>/<version>/vts/functional/vts/testcases/hal/<HAL Name>/hidl/`
+
+ `$ cp target target_profiling -rf`
+
+  Then rename the test from `<HAL Name>HidlTargetTest` to
+  `<HAL Name>HidlTargetProfilingTest` everywhere (see the example command below).
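+
+  For example, a hypothetical one-liner that rewrites the test name inside the
+  copied files (any file or directory names containing the old test name still
+  need to be renamed by hand):
+
+  `$ grep -rl HidlTargetTest target_profiling | xargs sed -i 's/HidlTargetTest/HidlTargetProfilingTest/g'`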
+
+* Set `enable-profiling` flag
+
+ Add `<option name="enable-profiling" value="true" />` to the corresponding
+`AndroidTest.xml` file under the `target_profiling` directory.
+
+ An [example AndroidTest.xml file](../../../../hardware/interfaces/vibrator/1.0/vts/functional/vts/testcases/hal/vibrator/hidl/target_profiling/AndroidTest.xml)
+looks like:
+
+---
+```
+<configuration description="Config for VTS VIBRATOR HIDL HAL's target-side test cases">
+ <target_preparer class="com.android.compatibility.common.tradefed.targetprep.VtsFilePusher">
+ <option name="push-group" value="HidlHalTest.push" />
+ </target_preparer>
+ <target_preparer class="com.android.tradefed.targetprep.VtsPythonVirtualenvPreparer" />
+ <test class="com.android.tradefed.testtype.VtsMultiDeviceTest">
+ <option name="test-module-name" value="VibratorHidlTargetProfilingTest" />
+ <option name="binary-test-sources" value="
+ _32bit::DATA/nativetest/vibrator_hidl_hal_test/vibrator_hidl_hal_test,
+ _64bit::DATA/nativetest64/vibrator_hidl_hal_test/vibrator_hidl_hal_test,
+ "/>
+ <option name="binary-test-type" value="gtest" />
+ <option name="test-timeout" value="1m" />
+ <option name="enable-profiling" value="true" />
+ </test>
+</configuration>
+```
+---
+
+* Schedule the profiling test
+
+ Add the following line to [vts-serving-staging-hal-hidl-profiling.xml](../../tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl-profiling.xml):
+
+ `<option name="compatibility:include-filter" value="<HAL Name>HidlTargetProfilingTest" />`
+  where `<HAL Name>` is the name of your HAL.
+
+* Subscribe to notification alert emails
+
+  Please check the [notification page](../web/notification_samples.md) for detailed instructions.
+
+  At this point everything is set up, so wait for a day or so and then visit your VTS Dashboard.
+  You should then be able to add `<HAL Name>HidlTargetProfilingTest` to your favorites list.
+  That is all you need to do to subscribe to alert emails, which are sent if any notable performance degradations are found by your profiling tests.
+  Also, if you click `<HAL Name>HidlTargetProfilingTest` on the dashboard main page,
+  the test result page shows up; its top-left side lists the APIs that have measured performance data.
+
+### 2.2. Host-Driven Tests
+
+This subsection describes how to enable profiling for host-driven tests.
+
+* Copy an existing test directory
+
+ `$ cd hardware/interfaces/<HAL Name>/<version>/vts/functional/vts/testcases/hal/<HAL Name>/hidl/`
+
+ `$ cp host host_profiling -rf`
+
+  Then rename the test from `<HAL Name>HidlHostTest` to
+  `<HAL Name>HidlHostProfilingTest` everywhere, as in the target-side case.
+
+* Update the configs
+
+ First, add `<option name="enable-profiling" value="true" />` to the corresponding
+  `AndroidTest.xml` file (just as for target-side tests).
+
+  Second, add the following code to the `setUpClass` function in your test script:
+
+```
+if self.enable_profiling:
+ profiling_utils.EnableVTSProfiling(self.dut.shell.one)
+```
+
+  Also, add the following code to the `tearDownClass` function in your test script:
+
+```
+if self.enable_profiling:
+ profiling_trace_path = getattr(
+ self, self.VTS_PROFILING_TRACING_PATH, "")
+ self.ProcessAndUploadTraceData(self.dut, profiling_trace_path)
+ profiling_utils.DisableVTSProfiling(self.dut.shell.one)
+```
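+
+For reference, here is a minimal sketch of how these snippets fit into a
+host-driven profiling test class. The import paths, base class, and HAL name
+below are assumptions; align them with your existing test script.
+
+```
+# Hypothetical host-driven profiling test; adjust imports and names to your HAL.
+from vts.runners.host import base_test_with_webdb
+from vts.runners.host import test_runner
+from vts.utils.python.profiling import profiling_utils  # assumed module path
+
+
+class NfcHidlHostProfilingTest(base_test_with_webdb.BaseTestWithWebDbClass):
+    """Example host-driven test with profiling enabled."""
+
+    def setUpClass(self):
+        # ... your existing device/HAL setup that defines self.dut ...
+        if self.enable_profiling:
+            profiling_utils.EnableVTSProfiling(self.dut.shell.one)
+
+    def tearDownClass(self):
+        if self.enable_profiling:
+            profiling_trace_path = getattr(
+                self, self.VTS_PROFILING_TRACING_PATH, "")
+            self.ProcessAndUploadTraceData(self.dut, profiling_trace_path)
+            profiling_utils.DisableVTSProfiling(self.dut.shell.one)
+
+
+if __name__ == "__main__":
+    test_runner.main()
+```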
diff --git a/doc/testcase_develop_manual/index.md b/doc/testcase_develop_manual/index.md
index c8ad3de..905dbbf 100644
--- a/doc/testcase_develop_manual/index.md
+++ b/doc/testcase_develop_manual/index.md
@@ -23,5 +23,8 @@
* [Codelab for Native Code Coverage Measurement](codelab_native_code_coverage.md)
* [Native Coverage Information and FAQ](native_coverage_faq.md)
-3. FAQs
+3. Enable Profiling for VTS HIDL HAL Tests
+  * [Codelab for Profiling VTS HIDL HAL Tests](codelab_enable_profiling.md)
+
+4. FAQs
* [How to directly run a VTS test without using VTS-TradeFed](run_vts_directly.md)
\ No newline at end of file
diff --git a/doc/web/notification_samples.md b/doc/web/notification_samples.md
index cff8770..fc97131 100644
--- a/doc/web/notification_samples.md
+++ b/doc/web/notification_samples.md
@@ -23,8 +23,6 @@
For details, visit the <a href="#">VTS dashboard.</a>
</div>
-
-
## VTS Performance Digests
Each day, the VTS Dashboard scans tests which have accompanying performance
diff --git a/proto/TestSchedulingPolicyMessage.proto b/proto/TestSchedulingPolicyMessage.proto
new file mode 100644
index 0000000..df99f00
--- /dev/null
+++ b/proto/TestSchedulingPolicyMessage.proto
@@ -0,0 +1,42 @@
+// Copyright 2017 The Android Open Source Project
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+syntax = "proto2";
+
+package android.vts;
+
+// Supported test scheduling modes.
+enum TestSchedulingMode {
+ UKNOWN_TEST_SCHEDULING_MODE_TYPE = 0;
+
+ // to schedule on ToT best effort
+ TEST_SCHEDULING_MODE_TOT_BEST_EFFORT = 1;
+
+ // to schedule once per period
+ TEST_SCHEDULING_MODE_PERIODIC = 2;
+}
+
+// To specify a test scheduling policy.
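+//
+// Example (protobuf text format; the values below are illustrative only):
+//   target_plans: "vts-serving-staging-hal-hidl-profiling"
+//   scheduling_mode: TEST_SCHEDULING_MODE_PERIODIC
+//   period_secs: 86400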
+message TestSchedulingPolicyMessage {
+  // One or multiple target test plans.
+ repeated bytes target_plans = 1;
+  // One or multiple target tests. Used if target_plans is not defined.
+ repeated bytes target_tests = 2;
+
+ // test scheduling mode
+ optional TestSchedulingMode scheduling_mode = 101;
+
+ // period in seconds (for TEST_SCHEDULING_MODE_PERIODIC).
+ optional uint32 period_secs = 110;
+}
diff --git a/proto/TestSchedulingPolicyMessage_pb2.py b/proto/TestSchedulingPolicyMessage_pb2.py
new file mode 100644
index 0000000..bf635d2
--- /dev/null
+++ b/proto/TestSchedulingPolicyMessage_pb2.py
@@ -0,0 +1,109 @@
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# source: TestSchedulingPolicyMessage.proto
+
+from google.protobuf.internal import enum_type_wrapper
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import message as _message
+from google.protobuf import reflection as _reflection
+from google.protobuf import descriptor_pb2
+# @@protoc_insertion_point(imports)
+
+
+
+
+DESCRIPTOR = _descriptor.FileDescriptor(
+ name='TestSchedulingPolicyMessage.proto',
+ package='android.vts',
+ serialized_pb='\n!TestSchedulingPolicyMessage.proto\x12\x0b\x61ndroid.vts\"\x98\x01\n\x1bTestSchedulingPolicyMessage\x12\x14\n\x0ctarget_plans\x18\x01 \x03(\x0c\x12\x14\n\x0ctarget_tests\x18\x02 \x03(\x0c\x12\x38\n\x0fscheduling_mode\x18\x65 \x01(\x0e\x32\x1f.android.vts.TestSchedulingMode\x12\x13\n\x0bperiod_secs\x18n \x01(\r*\x87\x01\n\x12TestSchedulingMode\x12$\n UKNOWN_TEST_SCHEDULING_MODE_TYPE\x10\x00\x12(\n$TEST_SCHEDULING_MODE_TOT_BEST_EFFORT\x10\x01\x12!\n\x1dTEST_SCHEDULING_MODE_PERIODIC\x10\x02')
+
+_TESTSCHEDULINGMODE = _descriptor.EnumDescriptor(
+ name='TestSchedulingMode',
+ full_name='android.vts.TestSchedulingMode',
+ filename=None,
+ file=DESCRIPTOR,
+ values=[
+ _descriptor.EnumValueDescriptor(
+ name='UKNOWN_TEST_SCHEDULING_MODE_TYPE', index=0, number=0,
+ options=None,
+ type=None),
+ _descriptor.EnumValueDescriptor(
+ name='TEST_SCHEDULING_MODE_TOT_BEST_EFFORT', index=1, number=1,
+ options=None,
+ type=None),
+ _descriptor.EnumValueDescriptor(
+ name='TEST_SCHEDULING_MODE_PERIODIC', index=2, number=2,
+ options=None,
+ type=None),
+ ],
+ containing_type=None,
+ options=None,
+ serialized_start=206,
+ serialized_end=341,
+)
+
+TestSchedulingMode = enum_type_wrapper.EnumTypeWrapper(_TESTSCHEDULINGMODE)
+UKNOWN_TEST_SCHEDULING_MODE_TYPE = 0
+TEST_SCHEDULING_MODE_TOT_BEST_EFFORT = 1
+TEST_SCHEDULING_MODE_PERIODIC = 2
+
+
+
+_TESTSCHEDULINGPOLICYMESSAGE = _descriptor.Descriptor(
+ name='TestSchedulingPolicyMessage',
+ full_name='android.vts.TestSchedulingPolicyMessage',
+ filename=None,
+ file=DESCRIPTOR,
+ containing_type=None,
+ fields=[
+ _descriptor.FieldDescriptor(
+ name='target_plans', full_name='android.vts.TestSchedulingPolicyMessage.target_plans', index=0,
+ number=1, type=12, cpp_type=9, label=3,
+ has_default_value=False, default_value=[],
+ message_type=None, enum_type=None, containing_type=None,
+ is_extension=False, extension_scope=None,
+ options=None),
+ _descriptor.FieldDescriptor(
+ name='target_tests', full_name='android.vts.TestSchedulingPolicyMessage.target_tests', index=1,
+ number=2, type=12, cpp_type=9, label=3,
+ has_default_value=False, default_value=[],
+ message_type=None, enum_type=None, containing_type=None,
+ is_extension=False, extension_scope=None,
+ options=None),
+ _descriptor.FieldDescriptor(
+ name='scheduling_mode', full_name='android.vts.TestSchedulingPolicyMessage.scheduling_mode', index=2,
+ number=101, type=14, cpp_type=8, label=1,
+ has_default_value=False, default_value=0,
+ message_type=None, enum_type=None, containing_type=None,
+ is_extension=False, extension_scope=None,
+ options=None),
+ _descriptor.FieldDescriptor(
+ name='period_secs', full_name='android.vts.TestSchedulingPolicyMessage.period_secs', index=3,
+ number=110, type=13, cpp_type=3, label=1,
+ has_default_value=False, default_value=0,
+ message_type=None, enum_type=None, containing_type=None,
+ is_extension=False, extension_scope=None,
+ options=None),
+ ],
+ extensions=[
+ ],
+ nested_types=[],
+ enum_types=[
+ ],
+ options=None,
+ is_extendable=False,
+ extension_ranges=[],
+ serialized_start=51,
+ serialized_end=203,
+)
+
+_TESTSCHEDULINGPOLICYMESSAGE.fields_by_name['scheduling_mode'].enum_type = _TESTSCHEDULINGMODE
+DESCRIPTOR.message_types_by_name['TestSchedulingPolicyMessage'] = _TESTSCHEDULINGPOLICYMESSAGE
+
+class TestSchedulingPolicyMessage(_message.Message):
+ __metaclass__ = _reflection.GeneratedProtocolMessageType
+ DESCRIPTOR = _TESTSCHEDULINGPOLICYMESSAGE
+
+ # @@protoc_insertion_point(class_scope:android.vts.TestSchedulingPolicyMessage)
+
+
+# @@protoc_insertion_point(module_scope)
diff --git a/runners/host/base_test_with_webdb.py b/runners/host/base_test_with_webdb.py
index 1c92ee2..be24c85 100644
--- a/runners/host/base_test_with_webdb.py
+++ b/runners/host/base_test_with_webdb.py
@@ -73,6 +73,7 @@
SERVICE_JSON_PATH = "service_key_json_path"
COVERAGE_ZIP = "coverage_zip"
REVISION_DICT = "revision_dict"
+ CHECKSUM_GCNO_DICT = "checksum_gcno_dict"
STATUS_TABLE = "vts_status_table"
BIGTABLE_BASE_URL = "bigtable_base_url"
BRANCH = "master"
@@ -401,6 +402,15 @@
self._profiling[name].end_timestamp = value
return True
+ def IsCoverageConfigSpecified(self):
+ """Determines if the config file specifies modules for coverage.
+
+ Returns:
+ True if the config file specifies modules for coverage measurement,
+ False otherwise
+ """
+ return hasattr(self, self.MODULES)
+
def InitializeCoverage(self):
"""Initializes the test for coverage instrumentation.
@@ -470,6 +480,11 @@
(self.BRANCH, build_flavor, device_build_id, product
))
return False
+
+ if not self.IsCoverageConfigSpecified():
+ checksum_gcno_dict = coverage_utils.GetChecksumGcnoDict(cov_zip)
+ setattr(self, self.CHECKSUM_GCNO_DICT, checksum_gcno_dict)
+
setattr(self, self.COVERAGE_ZIP, cov_zip)
setattr(self, self.REVISION_DICT, revision_dict)
setattr(self, self.COVERAGE, True)
@@ -515,14 +530,23 @@
try:
cov_zip = getattr(self, self.COVERAGE_ZIP)
- modules = getattr(self, self.MODULES)
revision_dict = getattr(self, self.REVISION_DICT)
except AttributeError as e:
logging.error("attributes not found %s", str(e))
return False
- coverage_utils.ProcessCoverageData(report_msg, cov_zip, modules,
- gcda_dict, revision_dict)
+ if not self.IsCoverageConfigSpecified():
+ # auto-process coverage data
+ checksum_gcno_dict = getattr(self, self.CHECKSUM_GCNO_DICT)
+ coverage_utils.ProcessCoverageData(
+ report_msg, gcda_dict, revision_dict,
+ checksum_gcno_dict=checksum_gcno_dict)
+ else:
+ # explicitly process coverage data for the specified modules
+ modules = getattr(self, self.MODULES)
+ coverage_utils.ProcessCoverageData(
+ report_msg, gcda_dict, revision_dict, modules=modules,
+ cov_zip=cov_zip)
return True
def ProcessAndUploadTraceData(self, dut, profiling_trace_path):
diff --git a/sysfuzzer/common/utils/InterfaceSpecUtil.cpp b/sysfuzzer/common/utils/InterfaceSpecUtil.cpp
index 50357c4..1f24391 100644
--- a/sysfuzzer/common/utils/InterfaceSpecUtil.cpp
+++ b/sysfuzzer/common/utils/InterfaceSpecUtil.cpp
@@ -38,8 +38,9 @@
string package_as_function_name(message.package());
ReplaceSubString(package_as_function_name, ".", "_");
prefix_ss << VTS_INTERFACE_SPECIFICATION_FUNCTION_NAME_PREFIX
- << message.component_class() << "_" << package_as_function_name
- << "_" << int(message.component_type_version()) << "_";
+ << message.component_class() << "_" << package_as_function_name << "_"
+ << int(message.component_type_version()) << "_"
+ << message.component_name() << "_";
}
return prefix_ss.str();
}
diff --git a/sysfuzzer/vtscompiler/VtsCompilerUtils.cpp b/sysfuzzer/vtscompiler/VtsCompilerUtils.cpp
index e7b62c3..1b2876f 100644
--- a/sysfuzzer/vtscompiler/VtsCompilerUtils.cpp
+++ b/sysfuzzer/vtscompiler/VtsCompilerUtils.cpp
@@ -199,8 +199,8 @@
exit(-1);
}
} else if (arg.vector_value(0).type() == TYPE_STRING) {
- return "const ::android::hardware::hidl_vec< "
- "::android::hardware::hidl_string> &";
+ return "::android::hardware::hidl_vec< "
+ "::android::hardware::hidl_string>";
} else {
cerr << __func__ << ":" << __LINE__ << " ERROR unsupported type "
<< arg.vector_value(0).type() << endl;
@@ -209,6 +209,8 @@
return "sp<" + arg.predefined_type() + ">";
} else if (arg.type() == TYPE_HANDLE) {
return "::android::hardware::hidl_handle";
+ } else if (arg.type() == TYPE_HIDL_INTERFACE) {
+ return "sp<" + arg.predefined_type() + ">";
}
cerr << __func__ << ":" << __LINE__ << " "
<< ": type " << arg.type() << " not supported" << endl;
diff --git a/sysfuzzer/vtscompiler/code_gen/driver/HalHidlCodeGen.cpp b/sysfuzzer/vtscompiler/code_gen/driver/HalHidlCodeGen.cpp
index 573c9d4..a42b460 100644
--- a/sysfuzzer/vtscompiler/code_gen/driver/HalHidlCodeGen.cpp
+++ b/sysfuzzer/vtscompiler/code_gen/driver/HalHidlCodeGen.cpp
@@ -401,6 +401,10 @@
} else if (arg.type() == TYPE_VECTOR) {
out << GetCppVariableType(arg, &message) << " ";
out << "arg" << arg_count << ";" << "\n";
+ } else if (arg.type() == TYPE_HIDL_INTERFACE) {
+ out << "/* TYPE_HIDL_INTERFACE not supported yet */\n";
+ out << GetCppVariableType(arg, &message) << " ";
+ out << "arg" << arg_count;
} else {
out << GetCppVariableType(arg, &message) << " ";
out << "arg" << arg_count << " = ";
@@ -408,7 +412,8 @@
if (arg.type() != TYPE_VECTOR &&
arg.type() != TYPE_HIDL_CALLBACK &&
- arg.type() != TYPE_STRUCT) {
+ arg.type() != TYPE_STRUCT &&
+ arg.type() != TYPE_HIDL_INTERFACE) {
std::stringstream msg_ss;
msg_ss << "func_msg->arg(" << arg_count << ")";
string msg = msg_ss.str();
@@ -490,6 +495,9 @@
} else if (arg.vector_value(0).type() == TYPE_STRUCT) {
out << "/* arg" << arg_count << "buffer[vector_index] not initialized "
<< "since TYPE_STRUCT not yet supported */" << "\n";
+ } else if (arg.vector_value(0).type() == TYPE_STRING) {
+ out << "/* arg" << arg_count << "buffer[vector_index] not initialized "
+ << "since TYPE_STRING not yet supported */" << "\n";
} else {
cerr << __func__ << ":" << __LINE__ << " ERROR unsupported type "
<< arg.vector_value(0).type() << "\n";
@@ -497,10 +505,12 @@
}
out.unindent();
out << "}" << "\n";
- out << "arg" << arg_count << ".setToExternal("
- << "arg" << arg_count << "buffer, "
- << "func_msg->arg(" << arg_count << ").vector_size()"
- << ")";
+ if (arg.vector_value(0).type() == TYPE_SCALAR
+ || arg.vector_value(0).type() == TYPE_ENUM) {
+ out << "arg" << arg_count << ".setToExternal(" << "arg"
+ << arg_count << "buffer, " << "func_msg->arg(" << arg_count
+ << ").vector_size()" << ")";
+ }
}
out << ";" << "\n";
if (arg.type() == TYPE_STRUCT) {
diff --git a/sysfuzzer/vtscompiler/test/golden/DRIVER/Nfc.driver.cpp b/sysfuzzer/vtscompiler/test/golden/DRIVER/Nfc.driver.cpp
index a45d5ec..e595b88 100644
--- a/sysfuzzer/vtscompiler/test/golden/DRIVER/Nfc.driver.cpp
+++ b/sysfuzzer/vtscompiler/test/golden/DRIVER/Nfc.driver.cpp
@@ -353,7 +353,7 @@
}
extern "C" {
-android::vts::FuzzerBase* vts_func_4_android_hardware_nfc_1_() {
+android::vts::FuzzerBase* vts_func_4_android_hardware_nfc_1_INfc_() {
return (android::vts::FuzzerBase*) new android::vts::FuzzerExtended_INfc();
}
diff --git a/sysfuzzer/vtscompiler/test/golden/DRIVER/hardware/interfaces/nfc/1.0/vts/Nfc.vts.h b/sysfuzzer/vtscompiler/test/golden/DRIVER/hardware/interfaces/nfc/1.0/vts/Nfc.vts.h
index 4efb50c..b4285ba 100644
--- a/sysfuzzer/vtscompiler/test/golden/DRIVER/hardware/interfaces/nfc/1.0/vts/Nfc.vts.h
+++ b/sysfuzzer/vtscompiler/test/golden/DRIVER/hardware/interfaces/nfc/1.0/vts/Nfc.vts.h
@@ -37,7 +37,7 @@
extern "C" {
-extern android::vts::FuzzerBase* vts_func_4_android_hardware_nfc_1_();
+extern android::vts::FuzzerBase* vts_func_4_android_hardware_nfc_1_INfc_();
}
} // namespace vts
} // namespace android
diff --git a/tools/vts-tradefed/res/config/vts-serving-staging-coverage.xml b/tools/vts-tradefed/res/config/vts-serving-staging-coverage.xml
index 14c9d15..0eee115 100644
--- a/tools/vts-tradefed/res/config/vts-serving-staging-coverage.xml
+++ b/tools/vts-tradefed/res/config/vts-serving-staging-coverage.xml
@@ -28,5 +28,6 @@
<option name="compatibility:include-filter" value="LightHidlTargetTest" />
<option name="compatibility:include-filter" value="VrHidlTargetTest" />
<option name="compatibility:include-filter" value="HalMemtrackHidlTargetTest" />
+ <option name="compatibility:include-filter" value="HalSoundTriggerHidlTargetBasicTest" />
<template-include name="reporters" default="basic-reporters" />
</configuration>
diff --git a/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl-profiling.xml b/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl-profiling.xml
new file mode 100644
index 0000000..4af060d
--- /dev/null
+++ b/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl-profiling.xml
@@ -0,0 +1,24 @@
+<?xml version="1.0" encoding="utf-8"?>
+<!-- Copyright (C) 2016 Google Inc.
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<configuration description="Run New VTS Profiling Tests for HIDL HALs in the Staging environment">
+ <include name="everything" />
+ <option name="plan" value="vts" />
+ <option name="compatibility:include-filter" value="VibratorHidlProfilingTest" />
+ <option name="compatibility:include-filter" value="VibratorHidlTargetProfilingTest" />
+ <option name="compatibility:include-filter" value="NfcHidlTargetProfilingTest" />
+ <option name="compatibility:include-filter" value="ThermalHidlTargetProfilingTest" />
+ <template-include name="reporters" default="basic-reporters" />
+</configuration>
diff --git a/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl.xml b/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl.xml
index baf9993..d1dd059 100644
--- a/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl.xml
+++ b/tools/vts-tradefed/res/config/vts-serving-staging-hal-hidl.xml
@@ -27,8 +27,6 @@
<option name="compatibility:include-filter" value="HalNfcHidlTargetBasicTest" />
<option name="compatibility:include-filter" value="VibratorHidlTargetTest" />
<option name="compatibility:include-filter" value="VibratorHidlTest" />
- <option name="compatibility:include-filter" value="VibratorHidlProfilingTest" />
- <option name="compatibility:include-filter" value="VibratorHidlTargetProfilingTest" />
<option name="compatibility:include-filter" value="HalAudioEffectHidlTargetBasicTest" />
<option name="compatibility:include-filter" value="ThermalHidlTargetTest" />
<option name="compatibility:include-filter" value="SensorsHidlTest" /><!-- need runtime stopped -->
@@ -39,6 +37,7 @@
<option name="compatibility:include-filter" value="VrHidlTest" />
<option name="compatibility:include-filter" value="VrHidlTargetTest" />
<option name="compatibility:include-filter" value="HalMemtrackHidlTargetTest" />
+ <option name="compatibility:include-filter" value="HalSoundTriggerHidlTargetBasicTest" />
<!-- option name="compatibility:include-filter" value="HalGraphicsAllocatorHidlTargetTest" / --><!-- need driver support -->
<!-- option name="compatibility:include-filter" value="HalGraphicsMapperHidlTargetTest" / --><!-- need driver support -->
<template-include name="reporters" default="basic-reporters" />
diff --git a/utils/python/coverage/coverage_utils.py b/utils/python/coverage/coverage_utils.py
index fc426e0..9003a53 100644
--- a/utils/python/coverage/coverage_utils.py
+++ b/utils/python/coverage/coverage_utils.py
@@ -28,6 +28,8 @@
TARGET_COVERAGE_PATH = "/data/local/tmp/"
LOCAL_COVERAGE_PATH = "/tmp/vts-test-coverage"
+GCNO_SUFFIX = ".gcno"
+GCDA_SUFFIX = ".gcda"
COVERAGE_SUFFIX = ".gcnodir"
GIT_PROJECT = "git_project"
MODULE_NAME = "module_name"
@@ -84,13 +86,188 @@
gcda_dict[basename] = gcda_content
return gcda_dict
-def ProcessCoverageData(report_msg, cov_zip, modules, gcda_dict, revision_dict):
+def GetChecksumGcnoDict(cov_zip):
+ """Generates a dictionary from gcno checksum to GCNOParser object.
+
+ Processes the gcnodir files in the zip file to produce a mapping from gcno
+ checksum to the GCNOParser object wrapping the gcno content.
+
+ Args:
+ cov_zip: the zip file containing gcnodir files from the device build
+
+ Returns:
+ the dictionary of gcno checksums to GCNOParser objects
+ """
+ checksum_gcno_dict = dict()
+ fnames = cov_zip.namelist()
+ instrumented_modules = [f for f in fnames if f.endswith(COVERAGE_SUFFIX)]
+ for instrumented_module in instrumented_modules:
+ # Read the gcnodir file
+ archive = archive_parser.Archive(cov_zip.open(instrumented_module).read())
+ try:
+ archive.Parse()
+ except ValueError:
+            logging.error("Archive could not be parsed: %s", instrumented_module)
+ continue
+
+ for gcno_file_path in archive.files:
+ file_name_path = gcno_file_path.rsplit(".", 1)[0]
+ file_name = os.path.basename(file_name_path)
+ gcno_stream = io.BytesIO(archive.files[gcno_file_path])
+ gcno_file_parser = gcno_parser.GCNOParser(gcno_stream)
+ checksum_gcno_dict[gcno_file_parser.checksum] = gcno_file_parser
+ return checksum_gcno_dict
+
+def ExtractSourceName(gcno_summary, file_name):
+ """Gets the source name from the GCNO summary object.
+
+ Gets the original source file name from the FileSummary object describing
+ a gcno file using the base filename of the gcno/gcda file.
+
+ Args:
+ gcno_summary: a FileSummary object describing a gcno file
+ file_name: the base filename (without extensions) of the gcno or gcda file
+
+ Returns:
+ The relative path to the original source file corresponding to the
+ provided gcno summary. The path is relative to the root of the build.
+ """
+ src_file_path = None
+ for key in gcno_summary.functions:
+ src_file_path = gcno_summary.functions[key].src_file_name
+ src_parts = src_file_path.rsplit(".", 1)
+ src_file_name = src_parts[0]
+ src_extension = src_parts[1]
+ if src_extension not in ["c", "cpp", "cc"]:
+ logging.warn("Found unsupported file type: %s", src_file_path)
+ continue
+ if src_file_name.endswith(file_name):
+ logging.info("Coverage source file: %s", src_file_path)
+ break
+ return src_file_path
+
+def AddCoverageReport(report_msg, src_file_path, gcno_summary,
+ git_project_name, git_project_path, revision):
+ """Adds a coverage report to the VtsReportMessage.
+
+ Processes the source information, git project information, and processed
+ coverage information and stores it into a CoverageReportMessage within the
+ report message.
+
+ Args:
+ report_msg: a TestReportMessage or TestCaseReportMessage object
+ src_file_path: the path to the original source file
+ gcno_summary: a FileSummary object describing a gcno file
+ git_project_name: the name of the git project containing the source
+ git_project_path: the path from the root to the git project
+ revision: the commit hash identifying the source code that was used to
+ build a device image
+ """
+ coverage_vec = coverage_report.GenerateLineCoverageVector(
+ src_file_path, gcno_summary)
+ coverage = report_msg.coverage.add()
+ coverage.total_line_count, coverage.covered_line_count = (
+ coverage_report.GetCoverageStats(coverage_vec))
+ coverage.line_coverage_vector.extend(coverage_vec)
+
+ src_file_path = os.path.relpath(src_file_path, git_project_path)
+ coverage.file_path = src_file_path
+ coverage.revision = revision
+ coverage.project_name = git_project_name
+
+def AutoProcess(report_msg, checksum_gcno_dict, gcda_dict, revision_dict):
+    """Processes coverage data and appends coverage reports to the report message.
+
+ Matches gcno files with gcda files and processes them into a coverage report
+ with references to the original source code used to build the system image.
+ Coverage information is appended as a CoverageReportMessage to the provided
+ report message.
+
+ Git project information is automatically extracted from the build info and
+ the source file name enclosed in each gcno file. Git project names must
+ resemble paths and may differ from the paths to their project root by at
+    most one. If no match is found, then the coverage information will not
+    be processed.
+
+ e.g. if the project path is test/vts, then its project name may be
+ test/vts or <some folder>/test/vts in order to be recognized.
+
+ Args:
+ report_msg: a TestReportMessage or TestCaseReportMessage object
+        checksum_gcno_dict: the dictionary of gcno checksums to
+            GCNOParser objects wrapping the gcno content
+ gcda_dict: the dictionary of gcda basenames to gcda content (binary string)
+ revision_dict: the dictionary with project names as keys and revision ID
+ strings as values.
+ """
+ for gcda_name in gcda_dict:
+ gcda_stream = io.BytesIO(gcda_dict[gcda_name])
+ gcda_file_parser = gcda_parser.GCDAParser(gcda_stream)
+
+        if gcda_file_parser.checksum not in checksum_gcno_dict:
+ logging.info("No matching gcno file for gcda: %s", gcda_name)
+ continue
+ gcno_file_parser = checksum_gcno_dict[gcda_file_parser.checksum]
+
+ try:
+ gcno_summary = gcno_file_parser.Parse()
+ except FileFormatError:
+            logging.error("Error parsing gcno file for gcda %s", gcda_name)
+ continue
+
+ file_name = gcda_name.rsplit(".", 1)[0]
+ src_file_path = ExtractSourceName(gcno_summary, file_name)
+
+ if not src_file_path:
+            logging.error("No source file found for %s.", gcda_name)
+ continue
+
+ # Process and merge gcno/gcda data
+ try:
+ gcda_file_parser.Parse(gcno_summary)
+ except FileFormatError:
+ logging.error("Error parsing gcda file %s", gcda_name)
+ continue
+
+ # Get the git project information
+ # Assumes that the project name and path to the project root are similar
+ revision = None
+ for project_name in revision_dict:
+ # Matches cases when source file root and project name are the same
+ if src_file_path.startswith(str(project_name)):
+ git_project_name = str(project_name)
+ git_project_path = str(project_name)
+ revision = str(revision_dict[project_name])
+ logging.info("Source file '%s' matched with project '%s'",
+ src_file_path, git_project_name)
+ break
+
+ parts = os.path.normpath(str(project_name)).split(os.sep, 1)
+ # Matches when project name has an additional prefix before the
+ # project path root.
+ if len(parts) > 1 and src_file_path.startswith(parts[-1]):
+ git_project_name = str(project_name)
+ git_project_path = parts[-1]
+ revision = str(revision_dict[project_name])
+ logging.info("Source file '%s' matched with project '%s'",
+ src_file_path, git_project_name)
+
+ if not revision:
+ logging.info("Could not find git info for %s", src_file_path)
+ continue
+
+ AddCoverageReport(report_msg, src_file_path, gcno_summary,
+ git_project_name, git_project_path, revision)
+
+
+def ManualProcess(report_msg, cov_zip, modules, gcda_dict, revision_dict):
"""Process coverage data and appends coverage reports to the report message.
Opens the gcno files in the cov_zip for the specified modules and matches
gcno/gcda files. Then, coverage vectors are generated for each set of matching
gcno/gcda files and appended as a CoverageReportMessage to the provided
- report message.
+ report message. Unlike AutoProcess, coverage information is only processed
+ for the modules explicitly defined in 'modules'.
Args:
report_msg: a TestReportMessage or TestCaseReportMessage object
@@ -128,7 +305,7 @@
try:
archive.Parse()
except ValueError:
- logging.error("Archive could not be parsed: %s" % name)
+ logging.error("Archive could not be parsed: %s", name)
continue
for gcno_file_path in archive.files:
@@ -144,46 +321,59 @@
src_file_path = None
# Match gcno file with gcda file
- gcda_name = file_name + ".gcda"
+ gcda_name = file_name + GCDA_SUFFIX
if gcda_name not in gcda_dict:
- logging.error("No gcda file found %s." % gcda_name)
+ logging.error("No gcda file found %s.", gcda_name)
continue
- gcda_content = gcda_dict[gcda_name]
- # Match gcno file with source files
- for key in gcno_summary.functions:
- src_file_path = gcno_summary.functions[key].src_file_name
- src_parts = src_file_path.rsplit(".", 1)
- src_file_name = src_parts[0]
- src_extension = src_parts[1]
- if src_extension not in ["c", "cpp", "cc"]:
- logging.warn("Found unsupported file type: %s" %
- src_file_path)
- continue
- if src_file_name.endswith(file_name):
- logging.info("Coverage source file: %s" %
- src_file_path)
- break
+ src_file_path = ExtractSourceName(gcno_summary, file_name)
if not src_file_path:
- logging.error("No source file found for %s." %
- gcno_file_path)
+ logging.error("No source file found for %s.", gcno_file_path)
continue
# Process and merge gcno/gcda data
+ gcda_content = gcda_dict[gcda_name]
gcda_stream = io.BytesIO(gcda_content)
try:
- gcda_parser.GCDAParser(gcda_stream, gcno_summary).Parse()
+ gcda_parser.GCDAParser(gcda_stream).Parse(gcno_summary)
except FileFormatError:
logging.error("Error parsing gcda file %s", gcda_content)
continue
- coverage_vec = coverage_report.GenerateLineCoverageVector(
- src_file_path, gcno_summary)
- coverage = report_msg.coverage.add()
- coverage.total_line_count, coverage.covered_line_count = (
- coverage_report.GetCoverageStats(coverage_vec))
- coverage.line_coverage_vector.extend(coverage_vec)
- src_file_path = os.path.relpath(src_file_path, git_project_path)
- coverage.file_path = src_file_path
- coverage.revision = revision
- coverage.project_name = git_project
+
+ AddCoverageReport(report_msg, src_file_path, gcno_summary,
+ git_project, git_project_path, revision)
+
+
+def ProcessCoverageData(report_msg, gcda_dict, revision_dict,
+ checksum_gcno_dict=None, cov_zip=None, modules=None):
+    """Processes coverage data and appends coverage reports to the report message.
+
+ Calls AutoProcess or ManualProcess depending on the provided inputs. If
+ checksum_gcno_dict is provided with a non-None value, then AutoProcess is
+ used to generate CoverageReportMessage objects appended to report_msg. If
+ both cov_zip and modules are provided with non-None values, then coverage
+ information is processed explicitly based on the requested modules using
+ ManualProcess.
+
+ Args:
+ report_msg: a TestReportMessage or TestCaseReportMessage object
+ gcda_dict: the dictionary of gcda basenames to gcda content (binary string)
+ revision_dict: the dictionary with project names as keys and revision ID
+ strings as values.
+        checksum_gcno_dict: the dictionary of gcno checksums to
+            GCNOParser objects wrapping the gcno content
+ cov_zip: the zip file containing gcnodir files from the device build
+ modules: the list of module names for which to enable coverage
+
+ Returns:
+ True if the coverage data is processed successfully, False otherwise.
+ """
+ if checksum_gcno_dict:
+ AutoProcess(report_msg, checksum_gcno_dict, gcda_dict, revision_dict)
+ elif cov_zip and modules:
+ ManualProcess(report_msg, cov_zip, modules, gcda_dict, revision_dict)
+ else:
+ logging.error("ProcessCoverageData: not enough arguments")
+ return False
+ return True
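+
+# Example usage sketch (illustrative; mirrors how base_test_with_webdb calls
+# this module):
+#     checksum_gcno_dict = GetChecksumGcnoDict(cov_zip)
+#     ProcessCoverageData(report_msg, gcda_dict, revision_dict,
+#                         checksum_gcno_dict=checksum_gcno_dict)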
diff --git a/utils/python/coverage/gcda_parser.py b/utils/python/coverage/gcda_parser.py
index 97a1e23..af69a04 100644
--- a/utils/python/coverage/gcda_parser.py
+++ b/utils/python/coverage/gcda_parser.py
@@ -49,7 +49,7 @@
TAG_OBJECT = 0xa1000000
TAG_PROGRAM = 0xa3000000
- def __init__(self, stream, file_summary):
+ def __init__(self, stream):
"""Inits the parser with the input stream and default values.
The byte order is set by default to little endian and the summary file
@@ -57,12 +57,29 @@
Args:
stream: An input binary file stream to a .gcno file
- file_summary: The summary from a parsed gcno file
"""
+ self._file_summary = None
super(GCDAParser, self).__init__(stream, self.MAGIC)
- self.file_summary = file_summary
- def Parse(self):
+ @property
+ def file_summary(self):
+ """Gets the FileSummary object where coverage data is stored.
+
+ Returns:
+ A FileSummary object.
+ """
+ return self._file_summary
+
+ @file_summary.setter
+    def file_summary(self, file_summary):
+ """Sets the FileSummary object in which to store coverage data.
+
+ Args:
+ file_summary: A FileSummary object from a processed gcno file
+ """
+ self._file_summary = file_summary
+
+ def Parse(self, file_summary):
"""Runs the parser on the file opened in the stream attribute.
Reads coverage information from the GCDA file stream and resolves
@@ -75,6 +92,7 @@
Raises:
parser.FileFormatError: invalid file format or invalid counts.
"""
+ self.file_summary = file_summary
func = None
while True:
@@ -163,4 +181,4 @@
"""
with open(file_name, 'rb') as stream:
- return GCDAParser(stream, file_summary).Parse()
+ return GCDAParser(stream).Parse(file_summary)
diff --git a/utils/python/coverage/gcda_parser_test.py b/utils/python/coverage/gcda_parser_test.py
index af6fe9a..1b6b53e 100644
--- a/utils/python/coverage/gcda_parser_test.py
+++ b/utils/python/coverage/gcda_parser_test.py
@@ -48,7 +48,8 @@
self.stream = MockStream.concat_int(self.stream, 0)
self.stream = MockStream.concat_string(self.stream, 'test')
length = 5
- parser = gcda_parser.GCDAParser(self.stream, fs)
+ parser = gcda_parser.GCDAParser(self.stream)
+ parser.file_summary = fs
func = parser.ReadFunction(5)
assert (func.ident == ident)
@@ -68,7 +69,8 @@
blocks[0].exit_arcs.append(arc)
blocks[i].entry_arcs.append(arc)
self.stream = MockStream.concat_int64(self.stream, i)
- parser = gcda_parser.GCDAParser(self.stream, fs)
+ parser = gcda_parser.GCDAParser(self.stream)
+ parser.file_summary = fs
parser.ReadCounts(func)
for i, arc in zip(range(1, n), blocks[0].exit_arcs):
self.assertEqual(i, arc.count)
@@ -103,7 +105,8 @@
blocks[i].entry_arcs.append(arc)
self.stream = MockStream.concat_int64(self.stream, i)
- parser = gcda_parser.GCDAParser(self.stream, fs)
+ parser = gcda_parser.GCDAParser(self.stream)
+ parser.file_summary = fs
parser.ReadCounts(func)
self.assertFalse(blocks[0].exit_arcs[0].resolved)
self.assertFalse(blocks[0].exit_arcs[1].resolved)
diff --git a/utils/python/coverage/parser.py b/utils/python/coverage/parser.py
index 89090ce..4cd9198 100644
--- a/utils/python/coverage/parser.py
+++ b/utils/python/coverage/parser.py
@@ -38,6 +38,7 @@
Attributes:
stream: File stream object for a GCNO file
format: Character denoting the endianness of the file
+ checksum: The checksum (int) of the file
"""
def __init__(self, stream, magic):
@@ -56,7 +57,7 @@
tag = self.ReadInt()
self.version = ''.join(
struct.unpack(self.format + 'ssss', self.stream.read(4)))
- self.ReadInt() # stamp
+ self.checksum = self.ReadInt()
if tag != magic:
tag = struct.unpack('>I', struct.pack('<I', tag))[0]
diff --git a/utils/python/mirror/mirror_object_for_types.py b/utils/python/mirror/mirror_object_for_types.py
index 1f9f2dd..6932c4e 100644
--- a/utils/python/mirror/mirror_object_for_types.py
+++ b/utils/python/mirror/mirror_object_for_types.py
@@ -19,10 +19,12 @@
import logging
import random
-from vts.utils.python.fuzzer import FuzzerUtils
-from vts.proto import ComponentSpecificationMessage_pb2 as CompSpecMsg
from google.protobuf import text_format
+from vts.proto import ComponentSpecificationMessage_pb2 as CompSpecMsg
+from vts.utils.python.fuzzer import FuzzerUtils
+from vts.utils.python.mirror import py2pb
+
class MirrorObjectError(Exception):
"""Raised when there is a general error in manipulating a mirror object."""
@@ -82,6 +84,37 @@
# SpecificationMessage.
return None
+ def GetAttributeSpecification(self, attribute_name):
+ """Returns the ProtoBuf specification of a requested attribute.
+
+ Args:
+ attribute_name: string, the name of a target attribute
+ (optionally excluding the namespace).
+
+ Returns:
+ VariableSpecificationMessage if found, None otherwise
+ """
+ for attribute in self._if_spec_msg.attribute:
+ if (attribute.name == attribute_name or
+ attribute.name.endswith("::" + attribute_name)):
+ return attribute
+ return None
+
+ def Py2Pb(self, attribute_name, py_values):
+        """Returns the ProtoBuf representation of the given Python values.
+
+ Args:
+ attribute_name: string, the name of a target attribute.
+ py_values: Python values.
+
+ Returns:
+ Converted VariableSpecificationMessage if found, None otherwise
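+
+        Example (illustrative; assumes `type_mirror` is an instance of this
+        class and the HAL defines a struct attribute named "NfcData"):
+
+            pb_msg = type_mirror.Py2Pb("NfcData", {"data": [1, 2, 3]})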
+ """
+ attribute_msg = self.GetAttributeSpecification(attribute_name)
+ if attribute_msg:
+ return py2pb.Convert(attribute_msg, py_values)
+ return None
+
def GetConstType(self, type_name):
"""Returns the Argument Specification Message.
diff --git a/utils/python/mirror/py2pb.py b/utils/python/mirror/py2pb.py
new file mode 100644
index 0000000..ee25c87
--- /dev/null
+++ b/utils/python/mirror/py2pb.py
@@ -0,0 +1,149 @@
+#!/usr/bin/env python
+#
+# Copyright (C) 2016 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+import logging
+
+from vts.proto import ComponentSpecificationMessage_pb2 as CompSpecMsg
+
+
+def PyValue2PbEnum(message, pb_spec, py_value):
+    """Converts a Python value to a VTS VariableSpecificationMessage (Enum).
+
+ Args:
+ message: VariableSpecificationMessage is the current and result
+ value message.
+ pb_spec: VariableSpecificationMessage which captures the
+ specification of a target attribute.
+ py_value: Python value provided by a test case.
+
+    Returns:
+        None. The converted value is stored in the provided message.
+ """
+ if pb_spec.name:
+ message.name = pb_spec.name
+ message.type = CompSpecMsg.TYPE_ENUM
+ # TODO(yim): derive the type by looking up its predefined_type.
+ setattr(message.scalar_value, "int32_t", py_value)
+
+
+def PyValue2PbScalar(message, pb_spec, py_value):
+    """Converts a Python value to a VTS VariableSpecificationMessage (Scalar).
+
+ Args:
+ message: VariableSpecificationMessage is the current and result
+ value message.
+ pb_spec: VariableSpecificationMessage which captures the
+ specification of a target attribute.
+ py_value: Python value provided by a test case.
+
+    Returns:
+        None. The converted value is stored in the provided message.
+ """
+ if pb_spec.name:
+ message.name = pb_spec.name
+ message.type = CompSpecMsg.TYPE_SCALAR
+ message.scalar_type = pb_spec.scalar_type
+ setattr(message.scalar_value, pb_spec.scalar_type, py_value)
+
+
+def PyList2PbVector(message, pb_spec, py_value):
+    """Converts a Python list to a VTS VariableSpecificationMessage (Vector).
+
+ Args:
+ message: VariableSpecificationMessage is the current and result
+ value message.
+ pb_spec: VariableSpecificationMessage which captures the
+ specification of a target attribute.
+ py_value: Python value provided by a test case.
+
+    Returns:
+        None. The converted value is stored in the provided message.
+ """
+ if pb_spec.name:
+ message.name = pb_spec.name
+ message.type = CompSpecMsg.TYPE_VECTOR
+ vector_spec = pb_spec.vector_value[0]
+ for curr_value in py_value:
+ new_vector_message = message.vector_value.add()
+ if vector_spec.type == CompSpecMsg.TYPE_SCALAR:
+ PyValue2PbScalar(new_vector_message, vector_spec, curr_value)
+ else:
+            logging.error("unsupported element type %s", vector_spec.type)
+            exit(-1)
+
+
+def PyDict2PbStruct(message, pb_spec, py_value):
+    """Converts a Python dict to a VTS VariableSpecificationMessage (struct).
+
+    Args:
+        message: VariableSpecificationMessage is the current and result
+            value message.
+ pb_spec: VariableSpecificationMessage which captures the
+ specification of a target attribute.
+ py_value: Python value provided by a test case.
+
+ Returns:
+ Converted VariableSpecificationMessage if found, None otherwise
+ """
+ if pb_spec.name:
+ message.name = pb_spec.name
+ message.type = CompSpecMsg.TYPE_STRUCT
+ for attr in pb_spec.struct_value:
+ if attr.name in py_value:
+ curr_value = py_value[attr.name]
+ attr_msg = message.struct_value.add()
+ if attr.type == CompSpecMsg.TYPE_ENUM:
+ PyValue2PbEnum(attr_msg, attr, curr_value)
+ elif attr.type == CompSpecMsg.TYPE_SCALAR:
+ PyValue2PbScalar(attr_msg, attr, curr_value)
+ elif attr.type == CompSpecMsg.TYPE_VECTOR:
+ PyList2PbVector(attr_msg, attr, curr_value)
+ else:
+ logging.error("PyDict2PbStruct: unsupported type %s",
+ attr.type)
+ exit(-1)
+ return message
+
+
+def Convert(pb_spec, py_value):
+    """Converts a Python data structure to a VTS VariableSpecificationMessage.
+
+ Args:
+ pb_spec: VariableSpecificationMessage which captures the
+ specification of a target attribute.
+ py_value: Python value provided by a test case.
+
+ Returns:
+ Converted VariableSpecificationMessage if found, None otherwise
+ """
+ if not pb_spec:
+        logging.error("py2pb.Convert: ProtoBuf spec is None")
+ return None
+
+ message = CompSpecMsg.VariableSpecificationMessage()
+ message.name = pb_spec.name
+
+ if pb_spec.type == CompSpecMsg.TYPE_STRUCT:
+ PyDict2PbStruct(message, pb_spec, py_value)
+    elif pb_spec.type == CompSpecMsg.TYPE_ENUM:
+        PyValue2PbEnum(message, pb_spec, py_value)
+    elif pb_spec.type == CompSpecMsg.TYPE_SCALAR:
+        PyValue2PbScalar(message, pb_spec, py_value)
+    elif pb_spec.type == CompSpecMsg.TYPE_VECTOR:
+ PyList2PbVector(message, pb_spec, py_value)
+ else:
+ logging.error("py2pb.Convert: unsupported type %s",
+ pb_spec.type)
+ exit(-1)
+
+ return message