change our tokenize to yield empty tokens

Our tokenize function currently skips empty tokens, which means we
incorrectly accept invalid seccomp filter syntax like:
	close: arg0 == 1 |||||| arg0 == 2

Change the tokenizer helper to yield an empty string in this case so
we can correctly detect & reject these.  We don't currently have any
scenarios where we actually want to allow empty strings either (and
if we did, the callers could check for them themselves).

Bug: None
Test: unittests pass
Change-Id: I282e4e4544a24c0e5a7036b693429bdd209339cf
diff --git a/util.h b/util.h
index 9ec88ce..7ff86b8 100644
--- a/util.h
+++ b/util.h
@@ -83,6 +83,18 @@
 int parse_size(size_t *size, const char *sizespec);
 
 char *strip(char *s);
+
+/*
+ * tokenize: locate the next token in @stringp using @delim
+ * @stringp A pointer to the string to scan for tokens
+ * @delim   The delimiter to split by
+ *
+ * Note that, unlike strtok, @delim is not a set of characters, but the full
+ * delimiter.  e.g. "a,;b,;c" with a delim of ",;" will yield ["a","b","c"].
+ *
+ * Note that, unlike strtok, this may return an empty token.  e.g. "a,,b" with
+ * strtok will yield ["a","b"], but this will yield ["a","","b"].
+ */
 char *tokenize(char **stringp, const char *delim);
 
 char *path_join(const char *external_path, const char *internal_path);