Fix evaluation of Unicode character arrays (char16_t[] and char32_t[])

Suppose we have the UTF-16 string:
    char16_t s[] = u"hello";
Before this patch, evaluating the string in lldb would produce:
    (char16_t [6]) $0 = ([0] = U+0068 u'h', [1] = U+0065 u'e', [2] = U+006c u'l', [3] = U+006c u'l', [4] = U+006f u'o', [5] = U+0000 u'\0')
After applying the patch, we now get:
    (char16_t [6]) $0 = u"hello"
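
For reference, a minimal program that reproduces the behavior above might look like the
following (the file name, the char32_t line, and the pointer variable are illustrative
additions, not taken from the patch):

    // repro.cpp -- build with debug info, e.g. clang++ -g -std=c++11 repro.cpp
    int main()
    {
        char16_t s[] = u"hello";   // UTF-16 array; summarized as u"hello" after the patch
        char32_t t[] = U"hello";   // UTF-32 array; expected to be summarized analogously
        const char16_t *p = s;     // pointer form, for comparison with the array case
        return 0;                  // break here and evaluate s, t and p
    }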

Patch from evgeny.leviant@gmail.com
Reviewed by: granata.enrico
Subscribers: lldb-commits
Differential Revision: http://reviews.llvm.org/D13053

llvm-svn: 248555
diff --git a/lldb/source/DataFormatters/FormattersHelpers.cpp b/lldb/source/DataFormatters/FormattersHelpers.cpp
index c2fcb98..648913e 100644
--- a/lldb/source/DataFormatters/FormattersHelpers.cpp
+++ b/lldb/source/DataFormatters/FormattersHelpers.cpp
@@ -306,3 +306,16 @@
         return UINT32_MAX;
     return idx;
 }
+
+lldb::addr_t
+lldb_private::formatters::GetArrayAddressOrPointerValue (ValueObject& valobj)
+{
+    lldb::addr_t data_addr = LLDB_INVALID_ADDRESS;
+
+    if (valobj.IsPointerType())
+        data_addr = valobj.GetValueAsUnsigned(0);
+    else if (valobj.IsArrayType())
+        data_addr = valobj.GetAddressOf();
+
+    return data_addr;
+}
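
Presumably for use by the character-string formatters, the new helper centralizes one
distinction: for a pointer the character data lives at the value stored in the pointer,
while for an array it lives at the address of the array object itself. A rough standalone
illustration of that distinction in plain C++ (not lldb API code; all names here are
hypothetical):

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        char16_t s[] = u"hello";
        const char16_t *p = s;

        // Array case: take the address of the object itself,
        // analogous to ValueObject::GetAddressOf() in the helper above.
        std::uintptr_t array_data_addr = reinterpret_cast<std::uintptr_t>(&s);

        // Pointer case: take the value stored in the pointer,
        // analogous to ValueObject::GetValueAsUnsigned(0) in the helper above.
        std::uintptr_t pointer_data_addr = reinterpret_cast<std::uintptr_t>(p);

        // Both resolve to the same character data here, since p points at s.
        std::printf("array data at 0x%llx, pointer data at 0x%llx\n",
                    (unsigned long long)array_data_addr,
                    (unsigned long long)pointer_data_addr);
        return 0;
    }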