change tokenize to yield empty tokens

Our tokenize function currently skips empty tokens.  This means we
incorrectly accept invalid syntax in our seccomp filters, like:
	close: arg0 == 1 |||||| arg0 == 2

Change the tokenizer helper to yield an empty string in this case so
we can correctly detect & reject these.  We don't currently have any
scenarios where we actually want to allow empty strings either (and
if we did, the callers could check for them themselves).
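
For reference, the new semantics follow strsep(3)-style splitting,
where consecutive delimiters yield empty tokens.  A minimal sketch of
a tokenizer with this behavior (the actual util.c change is not part
of this diff, and the strstr-based delimiter matching below is an
assumption):

	#include <string.h>

	char *tokenize(char **stringp, const char *delim) {
	  char *ret, *found;

	  /* A NULL or exhausted string has no tokens left. */
	  if (!stringp || !*stringp)
	    return NULL;

	  /* Treat a NULL/empty delimiter as "no split": one big token. */
	  if (!delim || !*delim) {
	    ret = *stringp;
	    *stringp = NULL;
	    return ret;
	  }

	  found = strstr(*stringp, delim);
	  if (!found) {
	    /* No delimiter left: the remainder is the final token. */
	    ret = *stringp;
	    *stringp = NULL;
	    return ret;
	  }

	  /* Split at the delimiter; yield the token even when empty. */
	  *found = '\0';
	  ret = *stringp;
	  *stringp = found + strlen(delim);
	  return ret;
	}

The key difference from the old behavior is that we no longer scan
past consecutive delimiters, so ",," now yields two empty tokens
instead of none; the new unittests below exercise exactly that.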

Bug: None
Test: unittests pass
Change-Id: I282e4e4544a24c0e5a7036b693429bdd209339cf
diff --git a/util_unittest.cc b/util_unittest.cc
index b5cdff7..ec3d714 100644
--- a/util_unittest.cc
+++ b/util_unittest.cc
@@ -65,3 +65,29 @@
   ASSERT_EQ(nullptr, p);
   ASSERT_EQ(nullptr, tokenize(&p, ","));
 }
+
+// Check edge case with an empty string.
+TEST(tokenize, empty_string) {
+  char str[] = "";
+  char *p = str;
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ(nullptr, p);
+  ASSERT_EQ(nullptr, tokenize(&p, ","));
+}
+
+// Check behavior with empty tokens at the start/middle/end.
+TEST(tokenize, empty_tokens) {
+  char str[] = ",,a,b,,,c,,";
+  char *p = str;
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("a", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("b", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("c", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ("", std::string(tokenize(&p, ",")));
+  ASSERT_EQ(nullptr, p);
+  ASSERT_EQ(nullptr, tokenize(&p, ","));
+}