Within the body of a lambda expression, decltype((x)) for an
id-expression 'x' is computed as if 'x' were captured, even when it is
not actually captured, per C++11 [expr.prim.lambda]p18 (see the example
after the list below). Two related refactorings go into implementing
this:
1) Split out the check that determines whether we should capture a
   particular variable reference, along with the computation of the
   type of the resulting capture field, from the actual act of
   capturing the variable.
2) Always compute the result of decltype() within Sema rather than in
   the AST, because the decltype() computation is now context-sensitive.
git-svn-id: https://llvm.org/svn/llvm-project/cfe/trunk@150347 91177308-0d34-0410-b5e6-96231b3b80d8
diff --git a/lib/Serialization/ASTReader.cpp b/lib/Serialization/ASTReader.cpp
index 312f5c7..75b5d68 100644
--- a/lib/Serialization/ASTReader.cpp
+++ b/lib/Serialization/ASTReader.cpp
@@ -3928,8 +3928,10 @@
     return Context.getTypeOfType(UnderlyingType);
   }
 
-  case TYPE_DECLTYPE:
-    return Context.getDecltypeType(ReadExpr(*Loc.F));
+  case TYPE_DECLTYPE: {
+    QualType UnderlyingType = readType(*Loc.F, Record, Idx);
+    return Context.getDecltypeType(ReadExpr(*Loc.F), UnderlyingType);
+  }
 
   case TYPE_UNARY_TRANSFORM: {
     QualType BaseType = readType(*Loc.F, Record, Idx);