Move SkSL caching out of GR_TEST_UTILS, trim persistent cache options

Combines the two boolean persistent cache options into a single tri-state
enum. The old GLSL option is still present (temporarily) until Chrome is
switched over.
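
For reference, a minimal sketch of what the combined option could look like.
The enum and field names below are illustrative assumptions, not necessarily
the spellings used in GrContextOptions:

    // Hypothetical tri-state replacement for the two booleans: one value
    // selects which form of the shader program is handed to the cache.
    enum class ShaderCacheStrategy {
        kSkSL,           // cache the pre-translation SkSL
        kBackendSource,  // cache backend source (GLSL, SPIR-V, ...)
        kBackendBinary,  // cache driver-compiled program binaries
    };
    ShaderCacheStrategy fShaderCacheStrategy = ShaderCacheStrategy::kBackendSource;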

Also adds a type tag for cached program binaries, so we can safely
detect cache entries of the wrong type.
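
The tag is a 32-bit value written at the front of each cache entry, so a
reader can verify what kind of blob it is holding before unpacking it. Below
is a minimal sketch of that check, assuming a hypothetical four-byte tag
constant and helper; the real tag values are an implementation detail of the
entry format and may be spelled differently:

    #include "include/core/SkData.h"
    #include "include/core/SkTypes.h"
    #include "src/core/SkReader32.h"

    // Hypothetical tag marking an entry as a driver program binary.
    static const uint32_t kProgramBinary_Tag = SkSetFourByteTag('G', 'L', 'P', 'B');

    // Returns true only if the entry leads with the expected tag, so a program
    // binary is never fed to the shader-unpacking path (or vice versa).
    static bool entry_is_program_binary(const SkData* data) {
        SkReader32 reader(data->data(), data->size());
        return reader.available() >= sizeof(uint32_t) &&
               reader.readU32() == kProgramBinary_Tag;
    }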

Change-Id: I0ddeefa9180b27bc2c46e2e7b77e6c9cdf4a730a
Reviewed-on: https://skia-review.googlesource.com/c/skia/+/238856
Commit-Queue: Brian Osman <brianosman@google.com>
Reviewed-by: Brian Salomon <bsalomon@google.com>
diff --git a/tools/gpu/MemoryCache.cpp b/tools/gpu/MemoryCache.cpp
index c3bc633..4da7fd7 100644
--- a/tools/gpu/MemoryCache.cpp
+++ b/tools/gpu/MemoryCache.cpp
@@ -96,7 +96,9 @@
         // Even with the SPIR-V switches, it seems like we must use .spv, or malisc tries to
         // run glslang on the input.
         const char* ext = GrBackendApi::kOpenGL == api ? "frag" : "spv";
-        GrPersistentCacheUtils::UnpackCachedShaders(data, shaders,
+        SkReader32 reader(data->data(), data->size());
+        reader.readU32(); // Shader type tag
+        GrPersistentCacheUtils::UnpackCachedShaders(&reader, shaders,
                                                     inputsIgnored, kGrShaderTypeCount);
 
         SkString filename = SkStringPrintf("%s/%s.%s", path, md5.c_str(), ext);