pw_tokenizer: Replace string literals with tokens

pw_tokenizer provides macros that replace printf-style string literals
with 32-bit hashes at compile time. The string literals themselves are
removed from the resulting binary, which can dramatically reduce its
size. Like any printf-style string, a tokenized string can be formatted
with arguments and then transmitted or stored.
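
As a rough illustration, tokenizing a string might look like the
following. (Sketch only; the macro names PW_TOKENIZE_STRING and
PW_TOKENIZE_TO_BUFFER and the header path are used for illustration
here and are not a specification of the final API.)

  #include <cstddef>
  #include <cstdint>

  #include "pw_tokenizer/tokenize.h"

  // The string literal is hashed to a 32-bit token at compile time; the
  // literal itself is not present in the binary.
  constexpr uint32_t kToken = PW_TOKENIZE_STRING("The answer is: %d");

  // Encodes the token and its arguments into a buffer for transmission
  // or storage. size_bytes holds the buffer size on input and the
  // encoded size on output.
  void EncodeAnswer(int answer, uint8_t* buffer, size_t* size_bytes) {
    PW_TOKENIZE_TO_BUFFER(buffer, size_bytes, "The answer is: %d", answer);
  }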

The pw_tokenizer module is general purpose, but its most common use
case is binary logging: human-readable text logs are replaced with
binary tokens, which are decoded off-device.

This commit includes the C and C++ code for tokenizing strings, as well
as a C++ library for decoding tokenized strings, including the
TokenDatabase class used to look up strings by token.
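
As a sketch of the decoding side, looking up a token in the
TokenDatabase added below might look like the following. (Illustrative
only; it assumes Entries is iterable with begin()/end() and that Entry
exposes token and string members, and it does not show how the database
itself is constructed.)

  #include <cstdint>
  #include <cstdio>

  #include "pw_tokenizer/token_database.h"

  // Prints every database string whose hash matches the given token.
  void PrintMatches(const pw::tokenizer::TokenDatabase& database,
                    uint32_t token) {
    for (const auto& entry : database.Find(token)) {
      std::printf("%08x: %s\n",
                  static_cast<unsigned>(entry.token),
                  entry.string);
    }
  }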

Change-Id: I6d5737ab2d6dfdd76dcf70c852b547fdcd68d683
diff --git a/pw_tokenizer/token_database.cc b/pw_tokenizer/token_database.cc
new file mode 100644
index 0000000..42c145b
--- /dev/null
+++ b/pw_tokenizer/token_database.cc
@@ -0,0 +1,45 @@
+// Copyright 2020 The Pigweed Authors
+//
+// Licensed under the Apache License, Version 2.0 (the "License"); you may not
+// use this file except in compliance with the License. You may obtain a copy of
+// the License at
+//
+//     https://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
+// WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
+// License for the specific language governing permissions and limitations under
+// the License.
+
+#include "pw_tokenizer/token_database.h"
+
+namespace pw::tokenizer {
+
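+// Returns the entry at the given index. Entries does not provide random
+// access internally, so an iterator is advanced index times from begin().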
+TokenDatabase::Entry TokenDatabase::Entries::operator[](size_t index) const {
+  Iterator it = begin();
+  for (size_t i = 0; i < index; ++i) {
+    ++it;
+  }
+  return it.entry();
+}
+
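+// Returns all entries that match the token. Entries are sorted by token, so
+// a linear scan finds the first and last matching entries.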
+TokenDatabase::Entries TokenDatabase::Find(const uint32_t token) const {
+  Iterator first = begin();
+  while (first != end() && token > first->token) {
+    ++first;
+  }
+
+  Iterator last = first;
+  while (last != end() && token == last->token) {
+    ++last;
+  }
+
+  return Entries(first, last);
+}
+
+}  // namespace pw::tokenizer