<html><body>
<style>

body, h1, h2, h3, div, span, p, pre, a {
  margin: 0;
  padding: 0;
  border: 0;
  font-weight: inherit;
  font-style: inherit;
  font-size: 100%;
  font-family: inherit;
  vertical-align: baseline;
}

body {
  font-size: 13px;
  padding: 1em;
}

h1 {
  font-size: 26px;
  margin-bottom: 1em;
}

h2 {
  font-size: 24px;
  margin-bottom: 1em;
}

h3 {
  font-size: 20px;
  margin-bottom: 1em;
  margin-top: 1em;
}

pre, code {
  line-height: 1.5;
  font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
}

pre {
  margin-top: 0.5em;
}

h1, h2, h3, p {
  font-family: Arial, sans-serif;
}

h1, h2, h3 {
  border-bottom: solid #CCC 1px;
}

.toc_element {
  margin-top: 0.5em;
}

.firstline {
  margin-left: 2em;
}

.method {
  margin-top: 1em;
  border: solid 1px #CCC;
  padding: 1em;
  background: #EEE;
}

.details {
  font-weight: bold;
  font-size: 14px;
}

</style>

<h1><a href="vision_v1p1beta1.html">Cloud Vision API</a> . <a href="vision_v1p1beta1.files.html">files</a></h1>
<h2>Instance Methods</h2>
<p class="toc_element">
  <code><a href="#annotate">annotate(body=None, x__xgafv=None)</a></code></p>
<p class="firstline">Service that performs image detection and annotation for a batch of files.</p>
<p class="toc_element">
  <code><a href="#asyncBatchAnnotate">asyncBatchAnnotate(body=None, x__xgafv=None)</a></code></p>
<p class="firstline">Run asynchronous image detection and annotation for a list of generic files.</p>
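<p>A minimal usage sketch (not part of the generated reference): it assumes the
<code>google-api-python-client</code> and <code>google-auth</code> packages are
installed and that Application Default Credentials with access to the Vision
API are available; the Cloud Storage URI is a hypothetical placeholder.</p>
<pre>
import google.auth
from googleapiclient.discovery import build

# Resolve credentials via Application Default Credentials.
credentials, _ = google.auth.default()

# Build a client for the v1p1beta1 surface of the Cloud Vision API.
service = build('vision', 'v1p1beta1', credentials=credentials)

# Annotate the first page of a PDF stored in Cloud Storage (placeholder URI).
body = {
    'requests': [{
        'inputConfig': {
            'gcsSource': {'uri': 'gs://my-bucket/document.pdf'},
            'mimeType': 'application/pdf',
        },
        'features': [{'type': 'DOCUMENT_TEXT_DETECTION'}],
        'pages': [1],
    }],
}
result = service.files().annotate(body=body).execute()

# Each entry in `responses` corresponds to one AnnotateFileRequest; each of
# those carries per-image responses holding the extracted OCR text.
for file_response in result.get('responses', []):
    for image_response in file_response.get('responses', []):
        print(image_response.get('fullTextAnnotation', {}).get('text', ''))
</pre>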
<h3>Method Details</h3>
<div class="method">
    <code class="details" id="annotate">annotate(body=None, x__xgafv=None)</code>
  <pre>Service that performs image detection and annotation for a batch of files.
Now only &quot;application/pdf&quot;, &quot;image/tiff&quot; and &quot;image/gif&quot; are supported.

This service will extract at most 5 (customers can specify which 5 in
AnnotateFileRequest.pages) frames (gif) or pages (pdf or tiff) from each
file provided and perform detection and annotation for each image
extracted.

Args:
  body: object, The request body.
    The object takes the form of:

{ # A list of requests to annotate files using the BatchAnnotateFiles API.
  &quot;parent&quot;: &quot;A String&quot;, # Optional. Target project and location to make a call.
      #
      # Format: `projects/{project-id}/locations/{location-id}`.
      #
      # If no parent is specified, a region will be chosen automatically.
      #
      # Supported location-ids:
      #     `us`: USA country only,
      #     `asia`: East Asia areas, like Japan, Taiwan,
      #     `eu`: The European Union.
      #
      # Example: `projects/project-A/locations/eu`.
  &quot;requests&quot;: [ # Required. The list of file annotation requests. Right now we support only one
      # AnnotateFileRequest in BatchAnnotateFilesRequest.
    { # A request to annotate one single file, e.g. a PDF, TIFF or GIF file.
      &quot;pages&quot;: [ # Pages of the file to perform image annotation.
          #
          # Pages start from 1; we assume the first page of the file is page 1.
          # At most 5 pages are supported per request. Pages can be negative.
          #
          # Page 1 means the first page.
          # Page 2 means the second page.
          # Page -1 means the last page.
          # Page -2 means the second to the last page.
          #
          # If the file is GIF instead of PDF or TIFF, page refers to GIF frames.
          #
          # If this field is empty, by default the service performs image annotation
          # for the first 5 pages of the file.
        42,
      ],
      &quot;inputConfig&quot;: { # The desired input location and metadata. # Required. Information about the input file.
        &quot;content&quot;: &quot;A String&quot;, # File content, represented as a stream of bytes.
            # Note: As with all `bytes` fields, protobuffers use a pure binary
            # representation, whereas JSON representations use base64.
            #
            # Currently, this field only works for BatchAnnotateFiles requests. It does
            # not work for AsyncBatchAnnotateFiles requests.
        &quot;gcsSource&quot;: { # The Google Cloud Storage location where the input will be read from. # The Google Cloud Storage location to read the input from.
          &quot;uri&quot;: &quot;A String&quot;, # Google Cloud Storage URI for the input file. This must only be a
              # Google Cloud Storage object. Wildcards are not currently supported.
        },
        &quot;mimeType&quot;: &quot;A String&quot;, # The type of the file. Currently only &quot;application/pdf&quot;, &quot;image/tiff&quot; and
            # &quot;image/gif&quot; are supported. Wildcards are not supported.
      },
      &quot;features&quot;: [ # Required. Requested features.
        { # The type of Google Cloud Vision API detection to perform, and the maximum
            # number of results to return for that type. Multiple `Feature` objects can
            # be specified in the `features` list.
          &quot;type&quot;: &quot;A String&quot;, # The feature type.
          &quot;maxResults&quot;: 42, # Maximum number of results of this type. Does not apply to
              # `TEXT_DETECTION`, `DOCUMENT_TEXT_DETECTION`, or `CROP_HINTS`.
          &quot;model&quot;: &quot;A String&quot;, # Model to use for the feature.
              # Supported values: &quot;builtin/stable&quot; (the default if unset) and
              # &quot;builtin/latest&quot;.
        },
      ],
      &quot;imageContext&quot;: { # Image context and/or feature-specific parameters. # Additional context that may accompany the image(s) in the file.
        &quot;languageHints&quot;: [ # List of languages to use for TEXT_DETECTION. In most cases, an empty value
            # yields the best results since it enables automatic language detection. For
            # languages based on the Latin alphabet, setting `language_hints` is not
            # needed. In rare cases, when the language of the text in the image is known,
            # setting a hint will help get better results (although it will be a
            # significant hindrance if the hint is wrong). Text detection returns an
            # error if one or more of the specified languages is not one of the
            # [supported languages](https://cloud.google.com/vision/docs/languages).
          &quot;A String&quot;,
        ],
        &quot;webDetectionParams&quot;: { # Parameters for web detection request. # Parameters for web detection.
          &quot;includeGeoResults&quot;: True or False, # Whether to include results derived from the geo information in the image.
        },
        &quot;latLongRect&quot;: { # Rectangle determined by min and max `LatLng` pairs. # Not used.
          &quot;minLatLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # Min lat/long pair.
              # of doubles representing degrees latitude and degrees longitude. Unless
              # specified otherwise, this must conform to the
              # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
              # standard&lt;/a&gt;. Values must be within normalized ranges.
            &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
            &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
          },
          &quot;maxLatLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # Max lat/long pair.
              # of doubles representing degrees latitude and degrees longitude. Unless
              # specified otherwise, this must conform to the
              # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
              # standard&lt;/a&gt;. Values must be within normalized ranges.
            &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
            &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
          },
        },
        &quot;cropHintsParams&quot;: { # Parameters for crop hints annotation request. # Parameters for crop hints annotation request.
          &quot;aspectRatios&quot;: [ # Aspect ratios in floats, representing the ratio of the width to the height
              # of the image. For example, if the desired aspect ratio is 4/3, the
              # corresponding float value should be 1.33333. If not specified, the
              # best possible crop is returned. The number of provided aspect ratios is
              # limited to a maximum of 16; any aspect ratios provided after the 16th are
              # ignored.
            3.14,
          ],
        },
        &quot;productSearchParams&quot;: { # Parameters for a product search request. # Parameters for product search.
          &quot;filter&quot;: &quot;A String&quot;, # The filtering expression. This can be used to restrict search results based
              # on Product labels. We currently support an AND of OR of key-value
              # expressions, where each expression within an OR must have the same key. An
              # &#x27;=&#x27; should be used to connect the key and value.
              #
              # For example, &quot;(color = red OR color = blue) AND brand = Google&quot; is
              # acceptable, but &quot;(color = red OR brand = Google)&quot; is not acceptable.
              # &quot;color: red&quot; is not acceptable because it uses a &#x27;:&#x27; instead of an &#x27;=&#x27;.
          &quot;productSet&quot;: &quot;A String&quot;, # The resource name of a ProductSet to be searched for similar images.
              #
              # Format is:
              # `projects/PROJECT_ID/locations/LOC_ID/productSets/PRODUCT_SET_ID`.
          &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The bounding polygon around the area of interest in the image.
              # If it is not specified, system discretion will be applied.
            &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
              { # A vertex represents a 2D point in the image.
                  # NOTE: the normalized vertex coordinates are relative to the original image
                  # and range from 0 to 1.
                &quot;y&quot;: 3.14, # Y coordinate.
                &quot;x&quot;: 3.14, # X coordinate.
              },
            ],
            &quot;vertices&quot;: [ # The bounding polygon vertices.
              { # A vertex represents a 2D point in the image.
                  # NOTE: the vertex coordinates are in the same scale as the original image.
                &quot;y&quot;: 42, # Y coordinate.
                &quot;x&quot;: 42, # X coordinate.
              },
            ],
          },
          &quot;productCategories&quot;: [ # The list of product categories to search in. Currently, we only consider
              # the first category, and either &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;, &quot;toys-v2&quot;,
              # &quot;packagedgoods-v1&quot;, or &quot;general-v1&quot; should be specified. The legacy
              # categories &quot;homegoods&quot;, &quot;apparel&quot;, and &quot;toys&quot; are still supported but will
              # be deprecated. For new products, please use &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;,
              # or &quot;toys-v2&quot; for better product search accuracy. It is recommended to
              # migrate existing products to these categories as well.
            &quot;A String&quot;,
          ],
        },
      },
    },
  ],
}

  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

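  Example (an illustrative sketch, not generated output): a request body that
  sends the file bytes inline via `content` instead of `gcsSource`. When the
  body is supplied as JSON, `bytes` fields must be base64-encoded, as noted
  above. The local filename is a hypothetical placeholder.

    import base64

    with open('document.pdf', 'rb') as f:
        pdf_bytes = f.read()

    body = {
        'requests': [{
            'inputConfig': {
                'content': base64.b64encode(pdf_bytes).decode('utf-8'),
                'mimeType': 'application/pdf',
            },
            'features': [{'type': 'DOCUMENT_TEXT_DETECTION'}],
        }],
    }
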
Returns:
  An object of the form:

    { # A list of file annotation responses.
      &quot;responses&quot;: [ # The list of file annotation responses, each response corresponding to each
          # AnnotateFileRequest in BatchAnnotateFilesRequest.
        { # Response to a single file annotation request. A file may contain one or more
            # images, which individually have their own responses.
          &quot;error&quot;: { # The `Status` type defines a logical error model that is suitable for # If set, represents the error message for the failed request. The
              # `responses` field will not be set in this case.
              # different programming environments, including REST APIs and RPC APIs. It is
              # used by [gRPC](https://github.com/grpc). Each `Status` message contains
              # three pieces of data: error code, error message, and error details.
              #
              # You can find out more about this error model and how to work with it in the
              # [API Design Guide](https://cloud.google.com/apis/design/errors).
            &quot;code&quot;: 42, # The status code, which should be an enum value of google.rpc.Code.
            &quot;message&quot;: &quot;A String&quot;, # A developer-facing error message, which should be in English. Any
                # user-facing error message should be localized and sent in the
                # google.rpc.Status.details field, or localized by the client.
            &quot;details&quot;: [ # A list of messages that carry the error details. There is a common set of
                # message types for APIs to use.
              {
                &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
              },
            ],
          },
          &quot;responses&quot;: [ # Individual responses to images found within the file. This field will be
              # empty if the `error` field is set.
            { # Response to an image annotation request.
              &quot;landmarkAnnotations&quot;: [ # If present, landmark detection has completed successfully.
                { # Set of detected entity features.
                  &quot;description&quot;: &quot;A String&quot;, # Entity textual description, expressed in its `locale` language.
                  &quot;topicality&quot;: 3.14, # The relevancy of the ICA (Image Content Annotation) label to the
                      # image. For example, the relevancy of &quot;tower&quot; is likely higher to an image
                      # containing the detected &quot;Eiffel Tower&quot; than to an image containing a
                      # detected distant towering building, even though the confidence that
                      # there is a tower in each image may be the same. Range [0, 1].
                  &quot;properties&quot;: [ # Some entities may have optional user-supplied `Property` (name/value)
                      # fields, such as a score or string that qualifies the entity.
                    { # A `Property` consists of a user-supplied name/value pair.
                      &quot;uint64Value&quot;: &quot;A String&quot;, # Value of numeric properties.
                      &quot;name&quot;: &quot;A String&quot;, # Name of the property.
                      &quot;value&quot;: &quot;A String&quot;, # Value of the property.
                    },
                  ],
                  &quot;score&quot;: 3.14, # Overall score of the result. Range [0, 1].
                  &quot;locations&quot;: [ # The location information for the detected entity. Multiple
                      # `LocationInfo` elements can be present because one location may
                      # indicate the location of the scene in the image, and another location
                      # may indicate the location of the place where the image was taken.
                      # Location information is usually present for landmarks.
                    { # Detected entity location information.
                      &quot;latLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # lat/long location coordinates.
                          # of doubles representing degrees latitude and degrees longitude. Unless
                          # specified otherwise, this must conform to the
                          # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
                          # standard&lt;/a&gt;. Values must be within normalized ranges.
                        &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
                        &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
                      },
                    },
                  ],
                  &quot;mid&quot;: &quot;A String&quot;, # Opaque entity ID. Some IDs may be available in
                      # [Google Knowledge Graph Search
                      # API](https://developers.google.com/knowledge-graph/).
                  &quot;confidence&quot;: 3.14, # **Deprecated. Use `score` instead.**
                      # The accuracy of the entity detection in an image.
                      # For example, for an image in which the &quot;Eiffel Tower&quot; entity is detected,
                      # this field represents the confidence that there is a tower in the query
                      # image. Range [0, 1].
                  &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # Image region to which this entity belongs. Not produced
                      # for `LABEL_DETECTION` features.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                  &quot;locale&quot;: &quot;A String&quot;, # The language code for the locale in which the entity textual
                      # `description` is expressed.
                },
              ],
              &quot;faceAnnotations&quot;: [ # If present, face detection has completed successfully.
                { # A face annotation object contains the results of face detection.
                  &quot;angerLikelihood&quot;: &quot;A String&quot;, # Anger likelihood.
                  &quot;landmarks&quot;: [ # Detected face landmarks.
                    { # A face-specific landmark (for example, a face feature).
                      &quot;position&quot;: { # A 3D position in the image, used primarily for Face detection landmarks. # Face landmark position.
                          # A valid Position must have both x and y coordinates.
                          # The position coordinates are in the same scale as the original image.
                        &quot;x&quot;: 3.14, # X coordinate.
                        &quot;z&quot;: 3.14, # Z coordinate (or depth).
                        &quot;y&quot;: 3.14, # Y coordinate.
                      },
                      &quot;type&quot;: &quot;A String&quot;, # Face landmark type.
                    },
                  ],
                  &quot;surpriseLikelihood&quot;: &quot;A String&quot;, # Surprise likelihood.
                  &quot;joyLikelihood&quot;: &quot;A String&quot;, # Joy likelihood.
                  &quot;landmarkingConfidence&quot;: 3.14, # Face landmarking confidence. Range [0, 1].
                  &quot;detectionConfidence&quot;: 3.14, # Detection confidence. Range [0, 1].
                  &quot;panAngle&quot;: 3.14, # Yaw angle, which indicates the leftward/rightward angle that the face is
                      # pointing relative to the vertical plane perpendicular to the image. Range
                      # [-180,180].
                  &quot;underExposedLikelihood&quot;: &quot;A String&quot;, # Under-exposed likelihood.
                  &quot;blurredLikelihood&quot;: &quot;A String&quot;, # Blurred likelihood.
                  &quot;headwearLikelihood&quot;: &quot;A String&quot;, # Headwear likelihood.
                  &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The bounding polygon around the face. The coordinates of the bounding box
                      # are in the original image&#x27;s scale.
                      # The bounding box is computed to &quot;frame&quot; the face in accordance with human
                      # expectations. It is based on the landmarker results.
                      # Note that one or more x and/or y coordinates may not be generated in the
                      # `BoundingPoly` (the polygon will be unbounded) if only a partial face
                      # appears in the image to be annotated.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                  &quot;rollAngle&quot;: 3.14, # Roll angle, which indicates the amount of clockwise/anti-clockwise rotation
                      # of the face relative to the image vertical about the axis perpendicular to
                      # the face. Range [-180,180].
                  &quot;sorrowLikelihood&quot;: &quot;A String&quot;, # Sorrow likelihood.
                  &quot;tiltAngle&quot;: 3.14, # Pitch angle, which indicates the upwards/downwards angle that the face is
                      # pointing relative to the image&#x27;s horizontal plane. Range [-180,180].
                  &quot;fdBoundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The `fd_bounding_poly` bounding polygon is tighter than the
                      # `boundingPoly`, and encloses only the skin part of the face. Typically, it
                      # is used to eliminate the face from any image analysis that detects the
                      # &quot;amount of skin&quot; visible in an image. It is not based on the
                      # landmarker results, only on the initial face detection, hence
                      # the &lt;code&gt;fd&lt;/code&gt; (face detection) prefix.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                },
              ],
              &quot;cropHintsAnnotation&quot;: { # Set of crop hints that are used to generate new crops when serving images. # If present, crop hints have completed successfully.
                &quot;cropHints&quot;: [ # Crop hint results.
                  { # Single crop hint that is used to generate a new crop when serving an image.
                    &quot;confidence&quot;: 3.14, # Confidence of this being a salient region. Range [0, 1].
                    &quot;importanceFraction&quot;: 3.14, # Fraction of importance of this salient region with respect to the original
                        # image.
                    &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The bounding polygon for the crop region. The coordinates of the bounding
                        # box are in the original image&#x27;s scale.
                      &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the normalized vertex coordinates are relative to the original image
                            # and range from 0 to 1.
                          &quot;y&quot;: 3.14, # Y coordinate.
                          &quot;x&quot;: 3.14, # X coordinate.
                        },
                      ],
                      &quot;vertices&quot;: [ # The bounding polygon vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the vertex coordinates are in the same scale as the original image.
                          &quot;y&quot;: 42, # Y coordinate.
                          &quot;x&quot;: 42, # X coordinate.
                        },
                      ],
                    },
                  },
                ],
              },
              &quot;labelAnnotations&quot;: [ # If present, label detection has completed successfully.
                { # Set of detected entity features.
                  &quot;description&quot;: &quot;A String&quot;, # Entity textual description, expressed in its `locale` language.
                  &quot;topicality&quot;: 3.14, # The relevancy of the ICA (Image Content Annotation) label to the
                      # image. For example, the relevancy of &quot;tower&quot; is likely higher to an image
                      # containing the detected &quot;Eiffel Tower&quot; than to an image containing a
                      # detected distant towering building, even though the confidence that
                      # there is a tower in each image may be the same. Range [0, 1].
                  &quot;properties&quot;: [ # Some entities may have optional user-supplied `Property` (name/value)
                      # fields, such as a score or string that qualifies the entity.
                    { # A `Property` consists of a user-supplied name/value pair.
                      &quot;uint64Value&quot;: &quot;A String&quot;, # Value of numeric properties.
                      &quot;name&quot;: &quot;A String&quot;, # Name of the property.
                      &quot;value&quot;: &quot;A String&quot;, # Value of the property.
                    },
                  ],
                  &quot;score&quot;: 3.14, # Overall score of the result. Range [0, 1].
                  &quot;locations&quot;: [ # The location information for the detected entity. Multiple
                      # `LocationInfo` elements can be present because one location may
                      # indicate the location of the scene in the image, and another location
                      # may indicate the location of the place where the image was taken.
                      # Location information is usually present for landmarks.
                    { # Detected entity location information.
                      &quot;latLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # lat/long location coordinates.
                          # of doubles representing degrees latitude and degrees longitude. Unless
                          # specified otherwise, this must conform to the
                          # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
                          # standard&lt;/a&gt;. Values must be within normalized ranges.
                        &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
                        &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
                      },
                    },
                  ],
                  &quot;mid&quot;: &quot;A String&quot;, # Opaque entity ID. Some IDs may be available in
                      # [Google Knowledge Graph Search
                      # API](https://developers.google.com/knowledge-graph/).
                  &quot;confidence&quot;: 3.14, # **Deprecated. Use `score` instead.**
                      # The accuracy of the entity detection in an image.
                      # For example, for an image in which the &quot;Eiffel Tower&quot; entity is detected,
                      # this field represents the confidence that there is a tower in the query
                      # image. Range [0, 1].
                  &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # Image region to which this entity belongs. Not produced
                      # for `LABEL_DETECTION` features.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                  &quot;locale&quot;: &quot;A String&quot;, # The language code for the locale in which the entity textual
                      # `description` is expressed.
                },
              ],
              &quot;productSearchResults&quot;: { # Results for a product search request. # If present, product search has completed successfully.
                &quot;productGroupedResults&quot;: [ # List of results grouped by products detected in the query image. Each entry
                    # corresponds to one bounding polygon in the query image, and contains the
                    # matching products specific to that region. There may be duplicate product
                    # matches in the union of all the per-product results.
                  { # Information about the products similar to a single product in a query
                      # image.
                    &quot;objectAnnotations&quot;: [ # List of generic predictions for the object in the bounding box.
                      { # Prediction for what the object in the bounding box is.
                        &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                            # information, see
                            # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                        &quot;mid&quot;: &quot;A String&quot;, # Object ID that should align with EntityAnnotation mid.
                        &quot;name&quot;: &quot;A String&quot;, # Object name, expressed in its `language_code` language.
                        &quot;score&quot;: 3.14, # Score of the result. Range [0, 1].
                      },
                    ],
                    &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The bounding polygon around the product detected in the query image.
                      &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the normalized vertex coordinates are relative to the original image
                            # and range from 0 to 1.
                          &quot;y&quot;: 3.14, # Y coordinate.
                          &quot;x&quot;: 3.14, # X coordinate.
                        },
                      ],
                      &quot;vertices&quot;: [ # The bounding polygon vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the vertex coordinates are in the same scale as the original image.
                          &quot;y&quot;: 42, # Y coordinate.
                          &quot;x&quot;: 42, # X coordinate.
                        },
                      ],
                    },
                    &quot;results&quot;: [ # List of results, one for each product match.
                      { # Information about a product.
                        &quot;image&quot;: &quot;A String&quot;, # The resource name of the image from the product that is the closest match
                            # to the query.
                        &quot;product&quot;: { # A Product contains ReferenceImages. # The Product.
                          &quot;displayName&quot;: &quot;A String&quot;, # The user-provided name for this Product. Must not be empty. Must be at most
                              # 4096 characters long.
                          &quot;description&quot;: &quot;A String&quot;, # User-provided metadata to be stored with this product. Must be at most 4096
                              # characters long.
                          &quot;productCategory&quot;: &quot;A String&quot;, # Immutable. The category for the product identified by the reference image. This should
                              # be either &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;, or &quot;toys-v2&quot;. The legacy categories
                              # &quot;homegoods&quot;, &quot;apparel&quot;, and &quot;toys&quot; are still supported, but these should
                              # not be used for new products.
                          &quot;productLabels&quot;: [ # Key-value pairs that can be attached to a product. At query time,
                              # constraints can be specified based on the product_labels.
                              #
                              # Note that integer values can be provided as strings, e.g. &quot;1199&quot;. Only
                              # strings with integer values can match a range-based restriction which is
                              # to be supported soon.
                              #
                              # Multiple values can be assigned to the same key. One product may have up to
                              # 500 product_labels.
                              #
                              # Notice that the total number of distinct product_labels over all products
                              # in one ProductSet cannot exceed 1M, otherwise the product search pipeline
                              # will refuse to work for that ProductSet.
                            { # A product label represented as a key-value pair.
                              &quot;key&quot;: &quot;A String&quot;, # The key of the label attached to the product. Cannot be empty and cannot
                                  # exceed 128 bytes.
                              &quot;value&quot;: &quot;A String&quot;, # The value of the label attached to the product. Cannot be empty and
                                  # cannot exceed 128 bytes.
                            },
                          ],
                          &quot;name&quot;: &quot;A String&quot;, # The resource name of the product.
                              #
                              # Format is:
                              # `projects/PROJECT_ID/locations/LOC_ID/products/PRODUCT_ID`.
                              #
                              # This field is ignored when creating a product.
                        },
                        &quot;score&quot;: 3.14, # A confidence level on the match, ranging from 0 (no confidence) to
                            # 1 (full confidence).
                      },
                    ],
                  },
                ],
                &quot;results&quot;: [ # List of results, one for each product match.
                  { # Information about a product.
                    &quot;image&quot;: &quot;A String&quot;, # The resource name of the image from the product that is the closest match
                        # to the query.
                    &quot;product&quot;: { # A Product contains ReferenceImages. # The Product.
                      &quot;displayName&quot;: &quot;A String&quot;, # The user-provided name for this Product. Must not be empty. Must be at most
                          # 4096 characters long.
                      &quot;description&quot;: &quot;A String&quot;, # User-provided metadata to be stored with this product. Must be at most 4096
                          # characters long.
                      &quot;productCategory&quot;: &quot;A String&quot;, # Immutable. The category for the product identified by the reference image. This should
                          # be either &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;, or &quot;toys-v2&quot;. The legacy categories
                          # &quot;homegoods&quot;, &quot;apparel&quot;, and &quot;toys&quot; are still supported, but these should
                          # not be used for new products.
                      &quot;productLabels&quot;: [ # Key-value pairs that can be attached to a product. At query time,
                          # constraints can be specified based on the product_labels.
                          #
                          # Note that integer values can be provided as strings, e.g. &quot;1199&quot;. Only
                          # strings with integer values can match a range-based restriction which is
                          # to be supported soon.
                          #
                          # Multiple values can be assigned to the same key. One product may have up to
                          # 500 product_labels.
                          #
                          # Notice that the total number of distinct product_labels over all products
                          # in one ProductSet cannot exceed 1M, otherwise the product search pipeline
                          # will refuse to work for that ProductSet.
                        { # A product label represented as a key-value pair.
                          &quot;key&quot;: &quot;A String&quot;, # The key of the label attached to the product. Cannot be empty and cannot
                              # exceed 128 bytes.
                          &quot;value&quot;: &quot;A String&quot;, # The value of the label attached to the product. Cannot be empty and
                              # cannot exceed 128 bytes.
                        },
                      ],
                      &quot;name&quot;: &quot;A String&quot;, # The resource name of the product.
                          #
                          # Format is:
                          # `projects/PROJECT_ID/locations/LOC_ID/products/PRODUCT_ID`.
                          #
                          # This field is ignored when creating a product.
                    },
                    &quot;score&quot;: 3.14, # A confidence level on the match, ranging from 0 (no confidence) to
                        # 1 (full confidence).
                  },
                ],
                &quot;indexTime&quot;: &quot;A String&quot;, # Timestamp of the index which provided these results. Products added to the
                    # product set and products removed from the product set after this time are
                    # not reflected in the current results.
              },
              &quot;localizedObjectAnnotations&quot;: [ # If present, localized object detection has completed successfully.
                  # This will be sorted descending by confidence score.
                { # Set of detected objects with bounding boxes.
                  &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                      # information, see
                      # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                  &quot;mid&quot;: &quot;A String&quot;, # Object ID that should align with EntityAnnotation mid.
                  &quot;name&quot;: &quot;A String&quot;, # Object name, expressed in its `language_code` language.
                  &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # Image region to which this object belongs. This must be populated.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                  &quot;score&quot;: 3.14, # Score of the result. Range [0, 1].
                },
              ],
              &quot;error&quot;: { # The `Status` type defines a logical error model that is suitable for # If set, represents the error message for the operation.
                  # Note that filled-in image annotations are guaranteed to be
                  # correct, even when `error` is set.
                  # different programming environments, including REST APIs and RPC APIs. It is
                  # used by [gRPC](https://github.com/grpc). Each `Status` message contains
                  # three pieces of data: error code, error message, and error details.
                  #
                  # You can find out more about this error model and how to work with it in the
                  # [API Design Guide](https://cloud.google.com/apis/design/errors).
                &quot;code&quot;: 42, # The status code, which should be an enum value of google.rpc.Code.
                &quot;message&quot;: &quot;A String&quot;, # A developer-facing error message, which should be in English. Any
                    # user-facing error message should be localized and sent in the
                    # google.rpc.Status.details field, or localized by the client.
                &quot;details&quot;: [ # A list of messages that carry the error details. There is a common set of
                    # message types for APIs to use.
                  {
                    &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
                  },
                ],
              },
              &quot;fullTextAnnotation&quot;: { # TextAnnotation contains a structured representation of OCR extracted text. # If present, text (OCR) detection or document (OCR) text detection has
                  # completed successfully.
                  # This annotation provides the structural hierarchy for the OCR detected
                  # text.
                  # The hierarchy of an OCR extracted text structure is like this:
                  #     TextAnnotation -&gt; Page -&gt; Block -&gt; Paragraph -&gt; Word -&gt; Symbol
                  # Each structural component, starting from Page, may further have its own
                  # properties. Properties describe detected languages, breaks, etc. Please refer
                  # to the TextAnnotation.TextProperty message definition below for more
                  # detail.
                &quot;pages&quot;: [ # List of pages detected by OCR.
                  { # Detected page from OCR.
                    &quot;confidence&quot;: 3.14, # Confidence of the OCR results on the page. Range [0, 1].
                    &quot;height&quot;: 42, # Page height. For PDFs the unit is points. For images (including
                        # TIFFs) the unit is pixels.
                    &quot;width&quot;: 42, # Page width. For PDFs the unit is points. For images (including
                        # TIFFs) the unit is pixels.
                    &quot;blocks&quot;: [ # List of blocks of text, images etc on this page.
                      { # Logical element on the page.
                        &quot;property&quot;: { # Additional information detected on the structural component. # Additional information detected for the block.
                          &quot;detectedLanguages&quot;: [ # A list of detected languages together with confidence.
                            { # Detected language for a structural component.
                              &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                                  # information, see
                                  # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                              &quot;confidence&quot;: 3.14, # Confidence of detected language. Range [0, 1].
                            },
                          ],
                          &quot;detectedBreak&quot;: { # Detected start or end of a structural component. # Detected start or end of a text segment.
                            &quot;type&quot;: &quot;A String&quot;, # Detected break type.
                            &quot;isPrefix&quot;: True or False, # True if break prepends the element.
                          },
                        },
                        &quot;blockType&quot;: &quot;A String&quot;, # Detected block type (text, image etc) for this block.
                        &quot;boundingBox&quot;: { # A bounding polygon for the detected image annotation. # The bounding box for the block.
                            # The vertices are in the order of top-left, top-right, bottom-right,
                            # bottom-left. When a rotation of the bounding box is detected the rotation
                            # is represented as around the top-left corner as defined when the text is
                            # read in the &#x27;natural&#x27; orientation.
                            # For example:
                            #
                            # * when the text is horizontal it might look like:
                            #
                            #         0----1
                            #         |    |
                            #         3----2
                            #
                            # * when it&#x27;s rotated 180 degrees around the top-left corner it becomes:
                            #
                            #         2----3
                            #         |    |
                            #         1----0
                            #
                            # and the vertex order will still be (0, 1, 2, 3).
                          &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                            { # A vertex represents a 2D point in the image.
                                # NOTE: the normalized vertex coordinates are relative to the original image
                                # and range from 0 to 1.
                              &quot;y&quot;: 3.14, # Y coordinate.
                              &quot;x&quot;: 3.14, # X coordinate.
                            },
                          ],
                          &quot;vertices&quot;: [ # The bounding polygon vertices.
                            { # A vertex represents a 2D point in the image.
                                # NOTE: the vertex coordinates are in the same scale as the original image.
                              &quot;y&quot;: 42, # Y coordinate.
                              &quot;x&quot;: 42, # X coordinate.
                            },
                          ],
                        },
                        &quot;confidence&quot;: 3.14, # Confidence of the OCR results on the block. Range [0, 1].
                        &quot;paragraphs&quot;: [ # List of paragraphs in this block (if this block is of type text).
                          { # Structural unit of text representing a number of words in certain order.
                            &quot;property&quot;: { # Additional information detected on the structural component. # Additional information detected for the paragraph.
                              &quot;detectedLanguages&quot;: [ # A list of detected languages together with confidence.
                                { # Detected language for a structural component.
                                  &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                                      # information, see
                                      # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                                  &quot;confidence&quot;: 3.14, # Confidence of detected language. Range [0, 1].
                                },
                              ],
                              &quot;detectedBreak&quot;: { # Detected start or end of a structural component. # Detected start or end of a text segment.
                                &quot;type&quot;: &quot;A String&quot;, # Detected break type.
                                &quot;isPrefix&quot;: True or False, # True if break prepends the element.
                              },
                            },
                            &quot;boundingBox&quot;: { # A bounding polygon for the detected image annotation. # The bounding box for the paragraph.
                                # The vertices are in the order of top-left, top-right, bottom-right,
                                # bottom-left. When a rotation of the bounding box is detected the rotation
                                # is represented as around the top-left corner as defined when the text is
                                # read in the &#x27;natural&#x27; orientation.
                                # For example:
                                # * when the text is horizontal it might look like:
                                #         0----1
                                #         |    |
                                #         3----2
                                # * when it&#x27;s rotated 180 degrees around the top-left corner it becomes:
                                #         2----3
                                #         |    |
                                #         1----0
                                # and the vertex order will still be (0, 1, 2, 3).
                              &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                                { # A vertex represents a 2D point in the image.
                                    # NOTE: the normalized vertex coordinates are relative to the original image
                                    # and range from 0 to 1.
                                  &quot;y&quot;: 3.14, # Y coordinate.
                                  &quot;x&quot;: 3.14, # X coordinate.
                                },
                              ],
                              &quot;vertices&quot;: [ # The bounding polygon vertices.
                                { # A vertex represents a 2D point in the image.
                                    # NOTE: the vertex coordinates are in the same scale as the original image.
                                  &quot;y&quot;: 42, # Y coordinate.
                                  &quot;x&quot;: 42, # X coordinate.
                                },
                              ],
                            },
                            &quot;confidence&quot;: 3.14, # Confidence of the OCR results for the paragraph. Range [0, 1].
                            &quot;words&quot;: [ # List of all words in this paragraph.
                              { # A word representation.
                                &quot;boundingBox&quot;: { # A bounding polygon for the detected image annotation. # The bounding box for the word.
                                    # The vertices are in the order of top-left, top-right, bottom-right,
                                    # bottom-left. When a rotation of the bounding box is detected the rotation
                                    # is represented as around the top-left corner as defined when the text is
                                    # read in the &#x27;natural&#x27; orientation.
                                    # For example:
                                    # * when the text is horizontal it might look like:
                                    #         0----1
                                    #         |    |
                                    #         3----2
                                    # * when it&#x27;s rotated 180 degrees around the top-left corner it becomes:
                                    #         2----3
                                    #         |    |
                                    #         1----0
                                    # and the vertex order will still be (0, 1, 2, 3).
                                  &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                                    { # A vertex represents a 2D point in the image.
                                        # NOTE: the normalized vertex coordinates are relative to the original image
                                        # and range from 0 to 1.
                                      &quot;y&quot;: 3.14, # Y coordinate.
                                      &quot;x&quot;: 3.14, # X coordinate.
                                    },
                                  ],
                                  &quot;vertices&quot;: [ # The bounding polygon vertices.
                                    { # A vertex represents a 2D point in the image.
                                        # NOTE: the vertex coordinates are in the same scale as the original image.
                                      &quot;y&quot;: 42, # Y coordinate.
                                      &quot;x&quot;: 42, # X coordinate.
                                    },
                                  ],
                                },
                                &quot;confidence&quot;: 3.14, # Confidence of the OCR results for the word. Range [0, 1].
                                &quot;symbols&quot;: [ # List of symbols in the word.
                                    # The order of the symbols follows the natural reading order.
                                  { # A single symbol representation.
                                    &quot;property&quot;: { # Additional information detected on the structural component. # Additional information detected for the symbol.
                                      &quot;detectedLanguages&quot;: [ # A list of detected languages together with confidence.
                                        { # Detected language for a structural component.
                                          &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                                              # information, see
                                              # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                                          &quot;confidence&quot;: 3.14, # Confidence of detected language. Range [0, 1].
                                        },
                                      ],
                                      &quot;detectedBreak&quot;: { # Detected start or end of a structural component. # Detected start or end of a text segment.
                                        &quot;type&quot;: &quot;A String&quot;, # Detected break type.
                                        &quot;isPrefix&quot;: True or False, # True if break prepends the element.
                                      },
                                    },
                                    &quot;boundingBox&quot;: { # A bounding polygon for the detected image annotation. # The bounding box for the symbol.
                                        # The vertices are in the order of top-left, top-right, bottom-right,
                                        # bottom-left. When a rotation of the bounding box is detected the rotation
                                        # is represented as around the top-left corner as defined when the text is
                                        # read in the &#x27;natural&#x27; orientation.
                                        # For example:
                                        # * when the text is horizontal it might look like:
                                        #         0----1
                                        #         |    |
                                        #         3----2
                                        # * when it&#x27;s rotated 180 degrees around the top-left corner it becomes:
                                        #         2----3
                                        #         |    |
                                        #         1----0
                                        # and the vertex order will still be (0, 1, 2, 3).
                                      &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                                        { # A vertex represents a 2D point in the image.
                                            # NOTE: the normalized vertex coordinates are relative to the original image
                                            # and range from 0 to 1.
                                          &quot;y&quot;: 3.14, # Y coordinate.
                                          &quot;x&quot;: 3.14, # X coordinate.
                                        },
                                      ],
                                      &quot;vertices&quot;: [ # The bounding polygon vertices.
                                        { # A vertex represents a 2D point in the image.
                                            # NOTE: the vertex coordinates are in the same scale as the original image.
                                          &quot;y&quot;: 42, # Y coordinate.
                                          &quot;x&quot;: 42, # X coordinate.
                                        },
                                      ],
                                    },
                                    &quot;confidence&quot;: 3.14, # Confidence of the OCR results for the symbol. Range [0, 1].
                                    &quot;text&quot;: &quot;A String&quot;, # The actual UTF-8 representation of the symbol.
                                  },
                                ],
                                &quot;property&quot;: { # Additional information detected on the structural component. # Additional information detected for the word.
                                  &quot;detectedLanguages&quot;: [ # A list of detected languages together with confidence.
                                    { # Detected language for a structural component.
                                      &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                                          # information, see
                                          # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                                      &quot;confidence&quot;: 3.14, # Confidence of detected language. Range [0, 1].
                                    },
                                  ],
                                  &quot;detectedBreak&quot;: { # Detected start or end of a structural component. # Detected start or end of a text segment.
                                    &quot;type&quot;: &quot;A String&quot;, # Detected break type.
                                    &quot;isPrefix&quot;: True or False, # True if break prepends the element.
                                  },
                                },
                              },
                            ],
                          },
                        ],
                      },
                    ],
                    &quot;property&quot;: { # Additional information detected on the structural component. # Additional information detected on the page.
                      &quot;detectedLanguages&quot;: [ # A list of detected languages together with confidence.
                        { # Detected language for a structural component.
                          &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code, such as &quot;en-US&quot; or &quot;sr-Latn&quot;. For more
                              # information, see
                              # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                          &quot;confidence&quot;: 3.14, # Confidence of detected language. Range [0, 1].
                        },
                      ],
                      &quot;detectedBreak&quot;: { # Detected start or end of a structural component. # Detected start or end of a text segment.
                        &quot;type&quot;: &quot;A String&quot;, # Detected break type.
                        &quot;isPrefix&quot;: True or False, # True if break prepends the element.
                      },
                    },
                  },
                ],
                &quot;text&quot;: &quot;A String&quot;, # UTF-8 text detected on the pages.
              },
              &quot;textAnnotations&quot;: [ # If present, text (OCR) detection has completed successfully.
                { # Set of detected entity features.
                  &quot;description&quot;: &quot;A String&quot;, # Entity textual description, expressed in its `locale` language.
                  &quot;topicality&quot;: 3.14, # The relevancy of the ICA (Image Content Annotation) label to the
                      # image. For example, the relevancy of &quot;tower&quot; is likely higher to an image
                      # containing the detected &quot;Eiffel Tower&quot; than to an image containing a
                      # detected distant towering building, even though the confidence that
                      # there is a tower in each image may be the same. Range [0, 1].
                  &quot;properties&quot;: [ # Some entities may have optional user-supplied `Property` (name/value)
                      # fields, such as a score or string that qualifies the entity.
                    { # A `Property` consists of a user-supplied name/value pair.
                      &quot;uint64Value&quot;: &quot;A String&quot;, # Value of numeric properties.
                      &quot;name&quot;: &quot;A String&quot;, # Name of the property.
                      &quot;value&quot;: &quot;A String&quot;, # Value of the property.
                    },
                  ],
                  &quot;score&quot;: 3.14, # Overall score of the result. Range [0, 1].
                  &quot;locations&quot;: [ # The location information for the detected entity. Multiple
                      # `LocationInfo` elements can be present because one location may
                      # indicate the location of the scene in the image, and another location
                      # may indicate the location of the place where the image was taken.
                      # Location information is usually present for landmarks.
                    { # Detected entity location information.
                      &quot;latLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # lat/long location coordinates.
                          # of doubles representing degrees latitude and degrees longitude. Unless
                          # specified otherwise, this must conform to the
                          # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
                          # standard&lt;/a&gt;. Values must be within normalized ranges.
                        &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
                        &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
                      },
                    },
                  ],
                  &quot;mid&quot;: &quot;A String&quot;, # Opaque entity ID. Some IDs may be available in
                      # [Google Knowledge Graph Search
                      # API](https://developers.google.com/knowledge-graph/).
                  &quot;confidence&quot;: 3.14, # **Deprecated. Use `score` instead.**
                      # The accuracy of the entity detection in an image.
                      # For example, for an image in which the &quot;Eiffel Tower&quot; entity is detected,
                      # this field represents the confidence that there is a tower in the query
                      # image. Range [0, 1].
                  &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # Image region to which this entity belongs. Not produced
                      # for `LABEL_DETECTION` features.
                    &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the normalized vertex coordinates are relative to the original image
                          # and range from 0 to 1.
                        &quot;y&quot;: 3.14, # Y coordinate.
                        &quot;x&quot;: 3.14, # X coordinate.
                      },
                    ],
                    &quot;vertices&quot;: [ # The bounding polygon vertices.
                      { # A vertex represents a 2D point in the image.
                          # NOTE: the vertex coordinates are in the same scale as the original image.
                        &quot;y&quot;: 42, # Y coordinate.
                        &quot;x&quot;: 42, # X coordinate.
                      },
                    ],
                  },
                  &quot;locale&quot;: &quot;A String&quot;, # The language code for the locale in which the entity textual
                      # `description` is expressed.
                },
              ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700992 &quot;imagePropertiesAnnotation&quot;: { # Stores image properties, such as dominant colors. # If present, image properties were extracted successfully.
993 &quot;dominantColors&quot;: { # Set of dominant colors and their corresponding scores. # If present, dominant colors completed successfully.
994 &quot;colors&quot;: [ # RGB color values with their score and pixel fraction.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700995 { # Color information consists of RGB channels, score, and the fraction of
996 # the image that the color occupies in the image.
Bu Sun Kim65020912020-05-20 12:08:20 -0700997 &quot;pixelFraction&quot;: 3.14, # The fraction of pixels the color occupies in the image.
998 # Value in range [0, 1].
999 &quot;color&quot;: { # Represents a color in the RGBA color space. This representation is designed # RGB components of the color.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -07001000 # for simplicity of conversion to/from color representations in various
1001 # languages over compactness; for example, the fields of this representation
Bu Sun Kim65020912020-05-20 12:08:20 -07001002 # can be trivially provided to the constructor of &quot;java.awt.Color&quot; in Java; it
1003 # can also be trivially provided to UIColor&#x27;s &quot;+colorWithRed:green:blue:alpha&quot;
Bu Sun Kim715bd7f2019-06-14 16:50:42 -07001004 # method in iOS; and, with just a little work, it can be easily formatted into
Bu Sun Kim65020912020-05-20 12:08:20 -07001005 # a CSS &quot;rgba()&quot; string in JavaScript, as well.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -07001006 #
1007 # Note: this proto does not carry information about the absolute color space
1008 # that should be used to interpret the RGB value (e.g. sRGB, Adobe RGB,
1009 # DCI-P3, BT.2020, etc.). By default, applications SHOULD assume the sRGB color
1010 # space.
1011 #
                            # Example (Java):
                            #
                            #      import com.google.type.Color;
                            #
                            #      // ...
                            #      public static java.awt.Color fromProto(Color protocolor) {
                            #        float alpha = protocolor.hasAlpha()
                            #            ? protocolor.getAlpha().getValue()
                            #            : 1.0f;
                            #
                            #        return new java.awt.Color(
                            #            protocolor.getRed(),
                            #            protocolor.getGreen(),
                            #            protocolor.getBlue(),
                            #            alpha);
                            #      }
                            #
                            #      public static Color toProto(java.awt.Color color) {
                            #        float red = (float) color.getRed();
                            #        float green = (float) color.getGreen();
                            #        float blue = (float) color.getBlue();
                            #        float denominator = 255.0f;
                            #        Color.Builder resultBuilder =
                            #            Color
                            #                .newBuilder()
                            #                .setRed(red / denominator)
                            #                .setGreen(green / denominator)
                            #                .setBlue(blue / denominator);
                            #        int alpha = color.getAlpha();
                            #        if (alpha != 255) {
                            #          resultBuilder.setAlpha(
                            #              FloatValue
                            #                  .newBuilder()
                            #                  .setValue(((float) alpha) / denominator)
                            #                  .build());
                            #        }
                            #        return resultBuilder.build();
                            #      }
                            #      // ...
                            #
                            # Example (iOS / Obj-C):
                            #
                            #      // ...
                            #      static UIColor* fromProto(Color* protocolor) {
                            #        float red = [protocolor red];
                            #        float green = [protocolor green];
                            #        float blue = [protocolor blue];
                            #        FloatValue* alpha_wrapper = [protocolor alpha];
                            #        float alpha = 1.0;
                            #        if (alpha_wrapper != nil) {
                            #          alpha = [alpha_wrapper value];
                            #        }
                            #        return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
                            #      }
                            #
                            #      static Color* toProto(UIColor* color) {
                            #        CGFloat red, green, blue, alpha;
                            #        if (![color getRed:&amp;red green:&amp;green blue:&amp;blue alpha:&amp;alpha]) {
                            #          return nil;
                            #        }
                            #        Color* result = [[Color alloc] init];
                            #        [result setRed:red];
                            #        [result setGreen:green];
                            #        [result setBlue:blue];
                            #        if (alpha &lt;= 0.9999) {
                            #          [result setAlpha:floatWrapperWithValue(alpha)];
                            #        }
                            #        [result autorelease];
                            #        return result;
                            #      }
                            #      // ...
                            #
                            # Example (JavaScript):
                            #
                            #      // ...
                            #
                            #      var protoToCssColor = function(rgb_color) {
                            #        var redFrac = rgb_color.red || 0.0;
                            #        var greenFrac = rgb_color.green || 0.0;
                            #        var blueFrac = rgb_color.blue || 0.0;
                            #        var red = Math.floor(redFrac * 255);
                            #        var green = Math.floor(greenFrac * 255);
                            #        var blue = Math.floor(blueFrac * 255);
                            #
                            #        if (!(&#x27;alpha&#x27; in rgb_color)) {
                            #          return rgbToCssColor_(red, green, blue);
                            #        }
                            #
                            #        var alphaFrac = rgb_color.alpha.value || 0.0;
                            #        var rgbParams = [red, green, blue].join(&#x27;,&#x27;);
                            #        return [&#x27;rgba(&#x27;, rgbParams, &#x27;,&#x27;, alphaFrac, &#x27;)&#x27;].join(&#x27;&#x27;);
                            #      };
                            #
                            #      var rgbToCssColor_ = function(red, green, blue) {
                            #        var rgbNumber = new Number((red &lt;&lt; 16) | (green &lt;&lt; 8) | blue);
                            #        var hexString = rgbNumber.toString(16);
                            #        var missingZeros = 6 - hexString.length;
                            #        var resultBuilder = [&#x27;#&#x27;];
                            #        for (var i = 0; i &lt; missingZeros; i++) {
                            #          resultBuilder.push(&#x27;0&#x27;);
                            #        }
                            #        resultBuilder.push(hexString);
                            #        return resultBuilder.join(&#x27;&#x27;);
                            #      };
                            #
                            #      // ...
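                            #
                            # Example (Python, an added sketch for parity with the samples above;
                            # not part of the upstream message documentation):
                            #
                            #      def proto_to_css_color(color):
                            #          # 'color' is this message as a decoded JSON dict; a missing
                            #          # 'alpha' means the color is fully opaque (alpha = 1.0).
                            #          red = int(color.get('red', 0.0) * 255)
                            #          green = int(color.get('green', 0.0) * 255)
                            #          blue = int(color.get('blue', 0.0) * 255)
                            #          alpha = color.get('alpha', 1.0)
                            #          return 'rgba(%d, %d, %d, %s)' % (red, green, blue, alpha)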
                          &quot;red&quot;: 3.14, # The amount of red in the color as a value in the interval [0, 1].
                          &quot;green&quot;: 3.14, # The amount of green in the color as a value in the interval [0, 1].
                          &quot;blue&quot;: 3.14, # The amount of blue in the color as a value in the interval [0, 1].
                          &quot;alpha&quot;: 3.14, # The fraction of this color that should be applied to the pixel. That is,
                              # the final pixel color is defined by the equation:
                              #
                              #   pixel color = alpha * (this color) + (1.0 - alpha) * (background color)
                              #
                              # This means that a value of 1.0 corresponds to a solid color, whereas
                              # a value of 0.0 corresponds to a completely transparent color. This
                              # uses a wrapper message rather than a simple float scalar so that it is
                              # possible to distinguish between a default value and the value being unset.
                              # If omitted, this color object is to be rendered as a solid color
                              # (as if the alpha value had been explicitly given with a value of 1.0).
                        },
                        &quot;score&quot;: 3.14, # Image-specific score for this color. Value in range [0, 1].
                      },
                    ],
                  },
                },
                &quot;logoAnnotations&quot;: [ # If present, logo detection has completed successfully.
                  { # Set of detected entity features.
                    &quot;description&quot;: &quot;A String&quot;, # Entity textual description, expressed in its `locale` language.
                    &quot;topicality&quot;: 3.14, # The relevancy of the ICA (Image Content Annotation) label to the
                        # image. For example, the relevancy of &quot;tower&quot; is likely higher to an image
                        # containing the detected &quot;Eiffel Tower&quot; than to an image containing a
                        # detected distant towering building, even though the confidence that
                        # there is a tower in each image may be the same. Range [0, 1].
                    &quot;properties&quot;: [ # Some entities may have optional user-supplied `Property` (name/value)
                        # fields, such as a score or string that qualifies the entity.
                      { # A `Property` consists of a user-supplied name/value pair.
                        &quot;uint64Value&quot;: &quot;A String&quot;, # Value of numeric properties.
                        &quot;name&quot;: &quot;A String&quot;, # Name of the property.
                        &quot;value&quot;: &quot;A String&quot;, # Value of the property.
                      },
                    ],
                    &quot;score&quot;: 3.14, # Overall score of the result. Range [0, 1].
                    &quot;locations&quot;: [ # The location information for the detected entity. Multiple
                        # `LocationInfo` elements can be present because one location may
                        # indicate the location of the scene in the image, and another location
                        # may indicate the location of the place where the image was taken.
                        # Location information is usually present for landmarks.
                      { # Detected entity location information.
                        &quot;latLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # lat/long location coordinates.
                            # of doubles representing degrees latitude and degrees longitude. Unless
                            # specified otherwise, this must conform to the
                            # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
                            # standard&lt;/a&gt;. Values must be within normalized ranges.
                          &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
                          &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
                        },
                      },
                    ],
                    &quot;mid&quot;: &quot;A String&quot;, # Opaque entity ID. Some IDs may be available in
                        # [Google Knowledge Graph Search
                        # API](https://developers.google.com/knowledge-graph/).
                    &quot;confidence&quot;: 3.14, # **Deprecated. Use `score` instead.**
                        # The accuracy of the entity detection in an image.
                        # For example, for an image in which the &quot;Eiffel Tower&quot; entity is detected,
                        # this field represents the confidence that there is a tower in the query
                        # image. Range [0, 1].
                    &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # Image region to which this entity belongs. Not produced
                        # for `LABEL_DETECTION` features.
                      &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the normalized vertex coordinates are relative to the original image
                            # and range from 0 to 1.
                          &quot;y&quot;: 3.14, # Y coordinate.
                          &quot;x&quot;: 3.14, # X coordinate.
                        },
                      ],
                      &quot;vertices&quot;: [ # The bounding polygon vertices.
                        { # A vertex represents a 2D point in the image.
                            # NOTE: the vertex coordinates are in the same scale as the original image.
                          &quot;y&quot;: 42, # Y coordinate.
                          &quot;x&quot;: 42, # X coordinate.
                        },
                      ],
                    },
                    &quot;locale&quot;: &quot;A String&quot;, # The language code for the locale in which the entity textual
                        # `description` is expressed.
                  },
                ],
                &quot;context&quot;: { # If an image was produced from a file (e.g. a PDF), this message gives # If present, contextual information is needed to understand where this image
                    # comes from.
                    # information about the source of that image.
                  &quot;uri&quot;: &quot;A String&quot;, # The URI of the file used to produce the image.
                  &quot;pageNumber&quot;: 42, # If the file was a PDF or TIFF, this field gives the page number within
                      # the file used to produce the image.
                },
                &quot;webDetection&quot;: { # Relevant information for the image from the Internet. # If present, web detection has completed successfully.
                  &quot;visuallySimilarImages&quot;: [ # The visually similar image results.
                    { # Metadata for online images.
                      &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the image.
                      &quot;url&quot;: &quot;A String&quot;, # The result image URL.
                    },
                  ],
                  &quot;bestGuessLabels&quot;: [ # The service&#x27;s best guess as to the topic of the request image.
                      # Inferred from similar images on the open web.
                    { # Label to provide extra metadata for the web detection.
                      &quot;label&quot;: &quot;A String&quot;, # Label for extra metadata.
                      &quot;languageCode&quot;: &quot;A String&quot;, # The BCP-47 language code for `label`, such as &quot;en-US&quot; or &quot;sr-Latn&quot;.
                          # For more information, see
                          # http://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
                    },
                  ],
                  &quot;fullMatchingImages&quot;: [ # Fully matching images from the Internet.
                      # Can include resized copies of the query image.
                    { # Metadata for online images.
                      &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the image.
                      &quot;url&quot;: &quot;A String&quot;, # The result image URL.
                    },
                  ],
                  &quot;webEntities&quot;: [ # Deduced entities from similar images on the Internet.
                    { # Entity deduced from similar images on the Internet.
                      &quot;entityId&quot;: &quot;A String&quot;, # Opaque entity ID.
                      &quot;description&quot;: &quot;A String&quot;, # Canonical description of the entity, in English.
                      &quot;score&quot;: 3.14, # Overall relevancy score for the entity.
                          # Not normalized and not comparable across different image queries.
                    },
                  ],
                  &quot;pagesWithMatchingImages&quot;: [ # Web pages containing the matching images from the Internet.
                    { # Metadata for web pages.
                      &quot;pageTitle&quot;: &quot;A String&quot;, # Title for the web page; may contain HTML markup.
                      &quot;fullMatchingImages&quot;: [ # Fully matching images on the page.
                          # Can include resized copies of the query image.
                        { # Metadata for online images.
                          &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the image.
                          &quot;url&quot;: &quot;A String&quot;, # The result image URL.
                        },
                      ],
                      &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the web page.
                      &quot;partialMatchingImages&quot;: [ # Partial matching images on the page.
                          # Those images are similar enough to share some key-point features. For
                          # example, an original image will likely have partial matching for its
                          # crops.
                        { # Metadata for online images.
                          &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the image.
                          &quot;url&quot;: &quot;A String&quot;, # The result image URL.
                        },
                      ],
                      &quot;url&quot;: &quot;A String&quot;, # The result web page URL.
                    },
                  ],
                  &quot;partialMatchingImages&quot;: [ # Partial matching images from the Internet.
                      # Those images are similar enough to share some key-point features. For
                      # example, an original image will likely have partial matching for its crops.
                    { # Metadata for online images.
                      &quot;score&quot;: 3.14, # (Deprecated) Overall relevancy score for the image.
                      &quot;url&quot;: &quot;A String&quot;, # The result image URL.
                    },
                  ],
                },
                &quot;safeSearchAnnotation&quot;: { # Set of features pertaining to the image, computed by computer vision # If present, safe-search annotation has completed successfully.
                    # methods over safe-search verticals (for example, adult, spoof, medical,
                    # violence).
                  &quot;medical&quot;: &quot;A String&quot;, # Likelihood that this is a medical image.
                  &quot;racy&quot;: &quot;A String&quot;, # Likelihood that the request image contains racy content. Racy content may
                      # include (but is not limited to) skimpy or sheer clothing, strategically
                      # covered nudity, lewd or provocative poses, or close-ups of sensitive
                      # body areas.
                  &quot;violence&quot;: &quot;A String&quot;, # Likelihood that this image contains violent content.
                  &quot;adult&quot;: &quot;A String&quot;, # Represents the adult content likelihood for the image. Adult content may
                      # contain elements such as nudity, pornographic images or cartoons, or
                      # sexual activities.
                  &quot;spoof&quot;: &quot;A String&quot;, # Spoof likelihood. The likelihood that a modification
                      # was made to the image&#x27;s canonical version to make it appear
                      # funny or offensive.
                },
              },
            ],
            &quot;inputConfig&quot;: { # The desired input location and metadata. # Information about the file for which this response is generated.
              &quot;content&quot;: &quot;A String&quot;, # File content, represented as a stream of bytes.
                  # Note: As with all `bytes` fields, protobuffers use a pure binary
                  # representation, whereas JSON representations use base64.
                  #
                  # Currently, this field only works for BatchAnnotateFiles requests. It does
                  # not work for AsyncBatchAnnotateFiles requests.
              &quot;gcsSource&quot;: { # The Google Cloud Storage location where the input will be read from. # The Google Cloud Storage location to read the input from.
                &quot;uri&quot;: &quot;A String&quot;, # Google Cloud Storage URI for the input file. This must only be a
                    # Google Cloud Storage object. Wildcards are not currently supported.
              },
              &quot;mimeType&quot;: &quot;A String&quot;, # The type of the file. Currently only &quot;application/pdf&quot;, &quot;image/tiff&quot; and
                  # &quot;image/gif&quot; are supported. Wildcards are not supported.
            },
            &quot;totalPages&quot;: 42, # This field gives the total number of pages in the file.
          },
        ],
    }</pre>
</div>
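<p>The reference above shows only the wire format. As a minimal usage sketch (an addition, not generator output: the file name, feature choice, and page selection are illustrative), the method can be invoked through the google-api-python-client discovery interface like this:</p>
<pre>
import base64

from googleapiclient.discovery import build

# Build a client for this API version; credentials are resolved from the
# environment (e.g. Application Default Credentials) unless supplied.
service = build('vision', 'v1p1beta1')

# Inline a small PDF and request document text detection on its first two
# pages ('invoice.pdf' is a placeholder file name).
pdf_bytes = open('invoice.pdf', 'rb').read()
request_body = {
    'requests': [
        {
            'inputConfig': {
                'content': base64.b64encode(pdf_bytes).decode('utf-8'),
                'mimeType': 'application/pdf',
            },
            'features': [{'type': 'DOCUMENT_TEXT_DETECTION'}],
            'pages': [1, 2],
        }
    ],
}

response = service.files().annotate(body=request_body).execute()

# One AnnotateFileResponse per input file, each holding one
# AnnotateImageResponse per extracted page.
for file_response in response.get('responses', []):
    for image_response in file_response.get('responses', []):
        print(image_response.get('fullTextAnnotation', {}).get('text', ''))
</pre>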

<div class="method">
    <code class="details" id="asyncBatchAnnotate">asyncBatchAnnotate(body=None, x__xgafv=None)</code>
  <pre>Run asynchronous image detection and annotation for a list of generic
files, such as PDF files, which may contain multiple pages and multiple
images per page. Progress and results can be retrieved through the
`google.longrunning.Operations` interface.
`Operation.metadata` contains `OperationMetadata` (metadata).
`Operation.response` contains `AsyncBatchAnnotateFilesResponse` (results).

Args:
  body: object, The request body.
    The object takes the form of:

{ # Multiple async file annotation requests are batched into a single service
    # call.
  &quot;requests&quot;: [ # Required. Individual async file annotation requests for this batch.
    { # An offline file annotation request.
      &quot;imageContext&quot;: { # Image context and/or feature-specific parameters. # Additional context that may accompany the image(s) in the file.
        &quot;languageHints&quot;: [ # List of languages to use for TEXT_DETECTION. In most cases, an empty value
            # yields the best results since it enables automatic language detection. For
            # languages based on the Latin alphabet, setting `language_hints` is not
            # needed. In rare cases, when the language of the text in the image is known,
            # setting a hint will help get better results (although it will be a
            # significant hindrance if the hint is wrong). Text detection returns an
            # error if one or more of the specified languages is not one of the
            # [supported languages](https://cloud.google.com/vision/docs/languages).
          &quot;A String&quot;,
        ],
        &quot;webDetectionParams&quot;: { # Parameters for web detection request. # Parameters for web detection.
          &quot;includeGeoResults&quot;: True or False, # Whether to include results derived from the geo information in the image.
        },
        &quot;latLongRect&quot;: { # Rectangle determined by min and max `LatLng` pairs. # Not used.
          &quot;minLatLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # Min lat/long pair.
              # of doubles representing degrees latitude and degrees longitude. Unless
              # specified otherwise, this must conform to the
              # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
              # standard&lt;/a&gt;. Values must be within normalized ranges.
            &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
            &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
          },
          &quot;maxLatLng&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # Max lat/long pair.
              # of doubles representing degrees latitude and degrees longitude. Unless
              # specified otherwise, this must conform to the
              # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
              # standard&lt;/a&gt;. Values must be within normalized ranges.
            &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
            &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
          },
        },
        &quot;cropHintsParams&quot;: { # Parameters for crop hints annotation request. # Parameters for crop hints annotation request.
          &quot;aspectRatios&quot;: [ # Aspect ratios in floats, representing the ratio of the width to the height
              # of the image. For example, if the desired aspect ratio is 4/3, the
              # corresponding float value should be 1.33333. If not specified, the
              # best possible crop is returned. The number of provided aspect ratios is
              # limited to a maximum of 16; any aspect ratios provided after the 16th are
              # ignored.
            3.14,
          ],
        },
        &quot;productSearchParams&quot;: { # Parameters for a product search request. # Parameters for product search.
          &quot;filter&quot;: &quot;A String&quot;, # The filtering expression. This can be used to restrict search results based
              # on Product labels. We currently support an AND of ORs of key-value
              # expressions, where each expression within an OR must have the same key. An
              # &#x27;=&#x27; should be used to connect the key and value.
              #
              # For example, &quot;(color = red OR color = blue) AND brand = Google&quot; is
              # acceptable, but &quot;(color = red OR brand = Google)&quot; is not acceptable.
              # &quot;color: red&quot; is not acceptable because it uses a &#x27;:&#x27; instead of an &#x27;=&#x27;.
          &quot;productSet&quot;: &quot;A String&quot;, # The resource name of a ProductSet to be searched for similar images.
              #
              # Format is:
              # `projects/PROJECT_ID/locations/LOC_ID/productSets/PRODUCT_SET_ID`.
          &quot;boundingPoly&quot;: { # A bounding polygon for the detected image annotation. # The bounding polygon around the area of interest in the image.
              # If it is not specified, system discretion will be applied.
            &quot;normalizedVertices&quot;: [ # The bounding polygon normalized vertices.
              { # A vertex represents a 2D point in the image.
                  # NOTE: the normalized vertex coordinates are relative to the original image
                  # and range from 0 to 1.
                &quot;y&quot;: 3.14, # Y coordinate.
                &quot;x&quot;: 3.14, # X coordinate.
              },
            ],
            &quot;vertices&quot;: [ # The bounding polygon vertices.
              { # A vertex represents a 2D point in the image.
                  # NOTE: the vertex coordinates are in the same scale as the original image.
                &quot;y&quot;: 42, # Y coordinate.
                &quot;x&quot;: 42, # X coordinate.
              },
            ],
          },
          &quot;productCategories&quot;: [ # The list of product categories to search in. Currently, we only consider
              # the first category, and either &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;, &quot;toys-v2&quot;,
              # &quot;packagedgoods-v1&quot;, or &quot;general-v1&quot; should be specified. The legacy
              # categories &quot;homegoods&quot;, &quot;apparel&quot;, and &quot;toys&quot; are still supported but will
              # be deprecated. For new products, please use &quot;homegoods-v2&quot;, &quot;apparel-v2&quot;,
              # or &quot;toys-v2&quot; for better product search accuracy. It is recommended to
              # migrate existing products to these categories as well.
            &quot;A String&quot;,
          ],
        },
      },
      &quot;outputConfig&quot;: { # The desired output location and metadata. # Required. The desired output location and metadata (e.g. format).
        &quot;gcsDestination&quot;: { # The Google Cloud Storage location where the output will be written to. # The Google Cloud Storage location to write the output(s) to.
          &quot;uri&quot;: &quot;A String&quot;, # Google Cloud Storage URI prefix where the results will be stored. Results
              # will be in JSON format, each preceded by its corresponding input URI prefix.
              # This field can represent either a GCS file prefix or a GCS directory. In
              # either case, the uri should be unique, because in order to get all of the
              # output files you will need to do a wildcard GCS search on the uri prefix
              # you provide.
              #
              # Examples:
              #
              # * File Prefix: gs://bucket-name/here/filenameprefix The output files
              #   will be created in gs://bucket-name/here/ and the names of the
              #   output files will begin with &quot;filenameprefix&quot;.
              #
              # * Directory Prefix: gs://bucket-name/some/location/ The output files
              #   will be created in gs://bucket-name/some/location/ and the names of the
              #   output files could be anything because there was no filename prefix
              #   specified.
              #
              # If multiple outputs, each response is still AnnotateFileResponse, each of
              # which contains some subset of the full list of AnnotateImageResponse.
              # Multiple outputs can happen if, for example, the output JSON is too large
              # and overflows into multiple sharded files.
        },
        &quot;batchSize&quot;: 42, # The max number of response protos to put into each output JSON file on
            # Google Cloud Storage.
            # The valid range is [1, 100]. If not specified, the default value is 20.
            #
            # For example, for one pdf file with 100 pages, 100 response protos will
            # be generated. If `batch_size` = 20, then 5 json files each
            # containing 20 response protos will be written under the prefix
            # `gcs_destination`.`uri`.
            #
            # Currently, batch_size only applies to GcsDestination, with potential future
            # support for other output configurations.
      },
      &quot;inputConfig&quot;: { # The desired input location and metadata. # Required. Information about the input file.
        &quot;content&quot;: &quot;A String&quot;, # File content, represented as a stream of bytes.
            # Note: As with all `bytes` fields, protobuffers use a pure binary
            # representation, whereas JSON representations use base64.
            #
            # Currently, this field only works for BatchAnnotateFiles requests. It does
            # not work for AsyncBatchAnnotateFiles requests.
        &quot;gcsSource&quot;: { # The Google Cloud Storage location where the input will be read from. # The Google Cloud Storage location to read the input from.
          &quot;uri&quot;: &quot;A String&quot;, # Google Cloud Storage URI for the input file. This must only be a
              # Google Cloud Storage object. Wildcards are not currently supported.
        },
        &quot;mimeType&quot;: &quot;A String&quot;, # The type of the file. Currently only &quot;application/pdf&quot;, &quot;image/tiff&quot; and
            # &quot;image/gif&quot; are supported. Wildcards are not supported.
      },
      &quot;features&quot;: [ # Required. Requested features.
        { # The type of Google Cloud Vision API detection to perform, and the maximum
            # number of results to return for that type. Multiple `Feature` objects can
            # be specified in the `features` list.
          &quot;type&quot;: &quot;A String&quot;, # The feature type.
          &quot;maxResults&quot;: 42, # Maximum number of results of this type. Does not apply to
              # `TEXT_DETECTION`, `DOCUMENT_TEXT_DETECTION`, or `CROP_HINTS`.
          &quot;model&quot;: &quot;A String&quot;, # Model to use for the feature.
              # Supported values: &quot;builtin/stable&quot; (the default if unset) and
              # &quot;builtin/latest&quot;.
        },
      ],
    },
  ],
  &quot;parent&quot;: &quot;A String&quot;, # Optional. Target project and location to make a call.
      #
      # Format: `projects/{project-id}/locations/{location-id}`.
      #
      # If no parent is specified, a region will be chosen automatically.
      #
      # Supported location-ids:
      # `us`: USA country only,
      # `asia`: East Asia areas, like Japan, Taiwan,
      # `eu`: The European Union.
      #
      # Example: `projects/project-A/locations/eu`.
}

  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # This resource represents a long-running operation that is the result of a
        # network API call.
      &quot;error&quot;: { # The `Status` type defines a logical error model that is suitable for # The error result of the operation in case of failure or cancellation.
          # different programming environments, including REST APIs and RPC APIs. It is
          # used by [gRPC](https://github.com/grpc). Each `Status` message contains
          # three pieces of data: error code, error message, and error details.
          #
          # You can find out more about this error model and how to work with it in the
          # [API Design Guide](https://cloud.google.com/apis/design/errors).
        &quot;code&quot;: 42, # The status code, which should be an enum value of google.rpc.Code.
        &quot;message&quot;: &quot;A String&quot;, # A developer-facing error message, which should be in English. Any
            # user-facing error message should be localized and sent in the
            # google.rpc.Status.details field, or localized by the client.
        &quot;details&quot;: [ # A list of messages that carry the error details. There is a common set of
            # message types for APIs to use.
          {
            &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
          },
        ],
      },
      &quot;metadata&quot;: { # Service-specific metadata associated with the operation. It typically
          # contains progress information and common metadata such as create time.
          # Some services might not provide such metadata. Any method that returns a
          # long-running operation should document the metadata type, if any.
        &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
      },
      &quot;done&quot;: True or False, # If the value is `false`, it means the operation is still in progress.
          # If `true`, the operation is completed, and either `error` or `response` is
          # available.
      &quot;response&quot;: { # The normal response of the operation in case of success. If the original
          # method returns no data on success, such as `Delete`, the response is
          # `google.protobuf.Empty`. If the original method is standard
          # `Get`/`Create`/`Update`, the response should be the resource. For other
          # methods, the response should have the type `XxxResponse`, where `Xxx`
          # is the original method name. For example, if the original method name
          # is `TakeSnapshot()`, the inferred response type is
          # `TakeSnapshotResponse`.
        &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
      },
      &quot;name&quot;: &quot;A String&quot;, # The server-assigned name, which is only unique within the same service that
          # originally returns it. If you use the default HTTP mapping, the
          # `name` should be a resource name ending with `operations/{unique_id}`.
    }</pre>
</div>
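<p>As with <code>annotate</code>, a minimal usage sketch (an addition, not generator output: the bucket URIs are placeholders, and how the returned operation is polled is left to the caller) for starting an asynchronous annotation of a PDF stored in Google Cloud Storage:</p>
<pre>
from googleapiclient.discovery import build

service = build('vision', 'v1p1beta1')

request_body = {
    'requests': [
        {
            'inputConfig': {
                'gcsSource': {'uri': 'gs://my-bucket/docs/report.pdf'},
                'mimeType': 'application/pdf',
            },
            'features': [{'type': 'DOCUMENT_TEXT_DETECTION'}],
            'outputConfig': {
                'gcsDestination': {'uri': 'gs://my-bucket/ocr-output/'},
                'batchSize': 20,  # 20 response protos per output JSON shard
            },
        }
    ],
}

operation = service.files().asyncBatchAnnotate(body=request_body).execute()

# The call returns a google.longrunning.Operation immediately. Track it by
# name through the Operations interface (how that interface is exposed
# depends on the client surface); once `done` is true, the sharded JSON
# results appear under the gcs_destination uri prefix.
print(operation['name'])
</pre>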

</body></html>