<html><body>
<style>

body, h1, h2, h3, div, span, p, pre, a {
  margin: 0;
  padding: 0;
  border: 0;
  font-weight: inherit;
  font-style: inherit;
  font-size: 100%;
  font-family: inherit;
  vertical-align: baseline;
}

body {
  font-size: 13px;
  padding: 1em;
}

h1 {
  font-size: 26px;
  margin-bottom: 1em;
}

h2 {
  font-size: 24px;
  margin-bottom: 1em;
}

h3 {
  font-size: 20px;
  margin-bottom: 1em;
  margin-top: 1em;
}

pre, code {
  line-height: 1.5;
  font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
}

pre {
  margin-top: 0.5em;
}

h1, h2, h3, p {
  font-family: Arial, sans-serif;
}

h1, h2, h3 {
  border-bottom: solid #CCC 1px;
}

.toc_element {
  margin-top: 0.5em;
}

.firstline {
  margin-left: 2em;
}

.method {
  margin-top: 1em;
  border: solid 1px #CCC;
  padding: 1em;
  background: #EEE;
}

.details {
  font-weight: bold;
  font-size: 14px;
}

</style>

<h1><a href="dialogflow_v2.html">Dialogflow API</a> . <a href="dialogflow_v2.projects.html">projects</a> . <a href="dialogflow_v2.projects.agent.html">agent</a> . <a href="dialogflow_v2.projects.agent.sessions.html">sessions</a></h1>
<h2>Instance Methods</h2>
<p class="toc_element">
  <code><a href="dialogflow_v2.projects.agent.sessions.contexts.html">contexts()</a></code>
</p>
<p class="firstline">Returns the contexts Resource.</p>

<p class="toc_element">
  <code><a href="dialogflow_v2.projects.agent.sessions.entityTypes.html">entityTypes()</a></code>
</p>
<p class="firstline">Returns the entityTypes Resource.</p>

<p class="toc_element">
  <code><a href="#deleteContexts">deleteContexts(parent, x__xgafv=None)</a></code></p>
<p class="firstline">Deletes all active contexts in the specified session.</p>
<p class="toc_element">
  <code><a href="#detectIntent">detectIntent(session, body=None, x__xgafv=None)</a></code></p>
<p class="firstline">Processes a natural language query and returns structured, actionable data as a result.</p>
<h3>Method Details</h3>
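<p>The listing below is a minimal, illustrative sketch (not part of the generated reference) showing how these session methods are typically called through the google-api-python-client discovery interface. It assumes Application Default Credentials are configured and uses hypothetical project and session IDs (<code>my-project</code>, <code>my-session-id</code>); the field read from the response comes from the <code>queryResult</code> schema documented under <code>detectIntent</code> below.</p>
<pre>
from googleapiclient.discovery import build

# Build the Dialogflow v2 client from the discovery document. Application
# Default Credentials are used when no explicit credentials are supplied.
service = build(&#x27;dialogflow&#x27;, &#x27;v2&#x27;)
sessions = service.projects().agent().sessions()

# Send a text query to a hypothetical session. Session names follow
# `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;`.
response = sessions.detectIntent(
    session=&#x27;projects/my-project/agent/sessions/my-session-id&#x27;,
    body={
        &#x27;queryInput&#x27;: {
            &#x27;text&#x27;: {
                &#x27;text&#x27;: &#x27;I want to book a table for two&#x27;,
                &#x27;languageCode&#x27;: &#x27;en-US&#x27;,
            },
        },
    },
).execute()

# The matched intent, if any, is reported in `queryResult.intent`
# (see the detectIntent response schema below).
print(response[&#x27;queryResult&#x27;][&#x27;intent&#x27;][&#x27;displayName&#x27;])
</pre>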
<div class="method">
    <code class="details" id="deleteContexts">deleteContexts(parent, x__xgafv=None)</code>
  <pre>Deletes all active contexts in the specified session.

Args:
  parent: string, Required. The name of the session to delete all contexts from. Format:
    `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;` or
    `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User ID&gt;/sessions/&lt;Session ID&gt;`.
    If `Environment ID` is not specified, we assume the default &#x27;draft&#x27; environment.
    If `User ID` is not specified, we assume the default &#x27;-&#x27; user. (required)
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A generic empty message that you can re-use to avoid defining duplicated
      # empty messages in your APIs. A typical example is to use it as the request
      # or the response type of an API method. For instance:
      #
      #     service Foo {
      #       rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty);
      #     }
      #
      # The JSON representation for `Empty` is empty JSON object `{}`.
    }</pre>
</div>
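<p>As a companion to the reference above, a short sketch (same assumptions and hypothetical IDs as the earlier example) of clearing all active contexts for a session with <code>deleteContexts</code>:</p>
<pre>
# Deletes every active context for the session. On success the call returns
# the empty message `{}` documented above.
sessions.deleteContexts(
    parent=&#x27;projects/my-project/agent/sessions/my-session-id&#x27;,
).execute()
</pre>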

<div class="method">
    <code class="details" id="detectIntent">detectIntent(session, body=None, x__xgafv=None)</code>
  <pre>Processes a natural language query and returns structured, actionable data
as a result. This method is not idempotent, because it may cause contexts
and session entity types to be updated, which in turn might affect
results of future queries.

Args:
  session: string, Required. The name of the session this query is sent to. Format:
    `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;`, or
    `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User ID&gt;/sessions/&lt;Session ID&gt;`.
    If `Environment ID` is not specified, we assume the default &#x27;draft&#x27; environment.
    If `User ID` is not specified, we use &quot;-&quot;. It&#x27;s up to the API caller to choose an
    appropriate `Session ID` and `User ID`. They can be a random number or some type of
    user and session identifiers (preferably hashed). The length of the `Session ID` and
    `User ID` must not exceed 36 characters.

    For more information, see the [API interactions
    guide](https://cloud.google.com/dialogflow/docs/api-overview). (required)
  body: object, The request body.
    The object takes the form of:

{ # The request to detect user&#x27;s intent.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700149 &quot;outputAudioConfigMask&quot;: &quot;A String&quot;, # Mask for output_audio_config indicating which settings in this
150 # request-level config should override speech synthesizer settings defined at
151 # agent-level.
152 #
153 # If unspecified or empty, output_audio_config replaces the agent-level
154 # config in its entirety.
155 &quot;queryParams&quot;: { # Represents the parameters of the conversational query. # The parameters of this query.
156 &quot;resetContexts&quot;: True or False, # Specifies whether to delete all contexts in the current session
157 # before the new ones are activated.
158 &quot;sentimentAnalysisRequestConfig&quot;: { # Configures the types of sentiment analysis to perform. # Configures the type of sentiment analysis to perform. If not
159 # provided, sentiment analysis is not performed.
160 &quot;analyzeQueryTextSentiment&quot;: True or False, # Instructs the service to perform sentiment analysis on
161 # `query_text`. If not provided, sentiment analysis is not performed on
162 # `query_text`.
163 },
164 &quot;sessionEntityTypes&quot;: [ # Additional session entity types to replace or extend developer
165 # entity types with. The entity synonyms apply to all languages and persist
166 # for the session of this query.
167 { # A session represents a conversation between a Dialogflow agent and an
168 # end-user. You can create special entities, called session entities, during a
169 # session. Session entities can extend or replace custom entity types and only
170 # exist during the session that they were created for. All session data,
171 # including session entities, is stored by Dialogflow for 20 minutes.
172 #
173 # For more information, see the [session entity
174 # guide](https://cloud.google.com/dialogflow/docs/entities-session).
175 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of this session entity type. Format:
176 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/entityTypes/&lt;Entity Type
177 # Display Name&gt;`, or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment
178 # ID&gt;/users/&lt;User ID&gt;/sessions/&lt;Session ID&gt;/entityTypes/&lt;Entity Type Display
179 # Name&gt;`.
180 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
181 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
182 #
183 # `&lt;Entity Type Display Name&gt;` must be the display name of an existing entity
184 # type in the same agent that will be overridden or supplemented.
185 &quot;entities&quot;: [ # Required. The collection of entities associated with this session entity
186 # type.
187 { # An **entity entry** for an associated entity type.
188 &quot;synonyms&quot;: [ # Required. A collection of value synonyms. For example, if the entity type
189 # is *vegetable*, and `value` is *scallions*, a synonym could be *green
190 # onions*.
191 #
192 # For `KIND_LIST` entity types:
193 #
194 # * This collection must contain exactly one synonym equal to `value`.
195 &quot;A String&quot;,
196 ],
197 &quot;value&quot;: &quot;A String&quot;, # Required. The primary value associated with this entity entry.
198 # For example, if the entity type is *vegetable*, the value could be
199 # *scallions*.
200 #
201 # For `KIND_MAP` entity types:
202 #
203 # * A reference value to be used in place of synonyms.
204 #
205 # For `KIND_LIST` entity types:
206 #
207 # * A string that can contain references to other entity types (with or
208 # without aliases).
209 },
210 ],
211 &quot;entityOverrideMode&quot;: &quot;A String&quot;, # Required. Indicates whether the additional data should override or
212 # supplement the custom entity type definition.
213 },
214 ],
215 &quot;payload&quot;: { # This field can be used to pass custom data to your webhook.
216 # Arbitrary JSON objects are supported.
217 # If supplied, the value is used to populate the
218 # `WebhookRequest.original_detect_intent_request.payload`
219 # field sent to your webhook.
220 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
221 },
222 &quot;geoLocation&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # The geo location of this conversational query.
223 # of doubles representing degrees latitude and degrees longitude. Unless
224 # specified otherwise, this must conform to the
225 # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
226 # standard&lt;/a&gt;. Values must be within normalized ranges.
227 &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
228 &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
229 },
230 &quot;contexts&quot;: [ # The collection of contexts to be activated before this query is
231 # executed.
232 { # Dialogflow contexts are similar to natural language context. If a person says
233 # to you &quot;they are orange&quot;, you need context in order to understand what &quot;they&quot;
234 # is referring to. Similarly, for Dialogflow to handle an end-user expression
235 # like that, it needs to be provided with context in order to correctly match
236 # an intent.
237 #
238 # Using contexts, you can control the flow of a conversation. You can configure
239 # contexts for an intent by setting input and output contexts, which are
240 # identified by string names. When an intent is matched, any configured output
241 # contexts for that intent become active. While any contexts are active,
242 # Dialogflow is more likely to match intents that are configured with input
243 # contexts that correspond to the currently active contexts.
244 #
245 # For more information about context, see the
246 # [Contexts guide](https://cloud.google.com/dialogflow/docs/contexts-overview).
247 &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
248 #
249 # Depending on your protocol or client library language, this is a
250 # map, associative array, symbol table, dictionary, or JSON object
251 # composed of a collection of (MapKey, MapValue) pairs:
252 #
253 # - MapKey type: string
254 # - MapKey value: parameter name
255 # - MapValue type:
256 # - If parameter&#x27;s entity type is a composite entity: map
257 # - Else: string or number, depending on parameter value type
258 # - MapValue value:
259 # - If parameter&#x27;s entity type is a composite entity:
260 # map from composite entity property names to property values
261 # - Else: parameter value
262 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
263 },
264 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
265 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
266 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
267 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
268 #
269 # The `Context ID` is always converted to lowercase, may only contain
270 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
271 #
272 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
273 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
274 #
275 # The following context names are reserved for internal use by Dialogflow.
276 # You should not use these contexts or create contexts with these names:
277 #
278 # * `__system_counters__`
279 # * `*_id_dialog_context`
280 # * `*_dialog_params_size`
281 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
282 # context expires. The default is `0`. If set to `0`, the context expires
283 # immediately. Contexts expire automatically after 20 minutes if there
284 # are no matching queries.
285 },
286 ],
287 &quot;timeZone&quot;: &quot;A String&quot;, # The time zone of this conversational query from the
288 # [time zone database](https://www.iana.org/time-zones), e.g.,
289 # America/New_York, Europe/Paris. If not provided, the time zone specified in
290 # agent settings is used.
291 },
292 &quot;inputAudio&quot;: &quot;A String&quot;, # The natural language speech audio to be processed. This field
293 # should be populated iff `query_input` is set to an input audio config.
294 # A single request can contain up to 1 minute of speech audio data.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700295 &quot;outputAudioConfig&quot;: { # Instructs the speech synthesizer on how to generate the output audio content. # Instructs the speech synthesizer how to generate the output
296 # audio. If this field is not set and agent-level speech synthesizer is not
297 # configured, no output audio is generated.
298 # If this audio config is supplied in a request, it overrides all existing
299 # text-to-speech settings applied to the agent.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700300 &quot;sampleRateHertz&quot;: 42, # The synthesis sample rate (in hertz) for this audio. If not
301 # provided, then the synthesizer will use the default sample rate based on
302 # the audio encoding. If this is different from the voice&#x27;s natural sample
303 # rate, then the synthesizer will honor this request by converting to the
304 # desired sample rate (which might result in worse audio quality).
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700305 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the synthesized audio content.
306 &quot;synthesizeSpeechConfig&quot;: { # Configuration of how speech should be synthesized. # Configuration of how speech should be synthesized.
307 &quot;volumeGainDb&quot;: 3.14, # Optional. Volume gain (in dB) of the normal native volume supported by the
308 # specific voice, in the range [-96.0, 16.0]. If unset, or set to a value of
309 # 0.0 (dB), will play at normal native signal amplitude. A value of -6.0 (dB)
310 # will play at approximately half the amplitude of the normal native signal
311 # amplitude. A value of +6.0 (dB) will play at approximately twice the
312 # amplitude of the normal native signal amplitude. We strongly recommend not
313 # to exceed +10 (dB) as there&#x27;s usually no effective increase in loudness for
314 # any value greater than that.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700315 &quot;speakingRate&quot;: 3.14, # Optional. Speaking rate/speed, in the range [0.25, 4.0]. 1.0 is the normal
316 # native speed supported by the specific voice. 2.0 is twice as fast, and
317 # 0.5 is half as fast. If unset(0.0), defaults to the native 1.0 speed. Any
318 # other values &lt; 0.25 or &gt; 4.0 will return an error.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700319 &quot;pitch&quot;: 3.14, # Optional. Speaking pitch, in the range [-20.0, 20.0]. 20 means increase 20
320 # semitones from the original pitch. -20 means decrease 20 semitones from the
321 # original pitch.
322 &quot;voice&quot;: { # Description of which voice to use for speech synthesis. # Optional. The desired voice of the synthesized audio.
323 &quot;name&quot;: &quot;A String&quot;, # Optional. The name of the voice. If not set, the service will choose a
324 # voice based on the other parameters such as language_code and
325 # ssml_gender.
326 &quot;ssmlGender&quot;: &quot;A String&quot;, # Optional. The preferred gender of the voice. If not set, the service will
327 # choose a voice based on the other parameters such as language_code and
328 # name. Note that this is only a preference, not requirement. If a
329 # voice of the appropriate gender is not available, the synthesizer should
330 # substitute a voice with a different gender rather than failing the request.
331 },
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700332 &quot;effectsProfileId&quot;: [ # Optional. An identifier which selects &#x27;audio effects&#x27; profiles that are
333 # applied on (post synthesized) text to speech. Effects are applied on top of
334 # each other in the order they are given.
335 &quot;A String&quot;,
336 ],
337 },
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700338 },
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700339 &quot;queryInput&quot;: { # Represents the query input. It can contain either: # Required. The input specification. It can be set to:
340 #
341 # 1. an audio config
342 # which instructs the speech recognizer how to process the speech audio,
343 #
344 # 2. a conversational query in the form of text, or
345 #
346 # 3. an event that specifies which intent to trigger.
347 #
348 # 1. An audio config which
349 # instructs the speech recognizer how to process the speech audio.
350 #
351              #       2. A conversational query in the form of text.
352 #
353 # 3. An event that specifies which intent to trigger.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700354 &quot;text&quot;: { # Represents the natural language text to be processed. # The natural language text to be processed.
355 &quot;text&quot;: &quot;A String&quot;, # Required. The UTF-8 encoded natural language text to be processed.
356 # Text length must not exceed 256 characters.
357 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of this conversational query. See [Language
358 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
359 # for a list of the currently supported language codes. Note that queries in
360 # the same session do not necessarily need to specify the same language.
361 },
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700362 &quot;event&quot;: { # Events allow for matching intents by event name instead of the natural # The event to be processed.
363 # language input. For instance, input `&lt;event: { name: &quot;welcome_event&quot;,
364 # parameters: { name: &quot;Sam&quot; } }&gt;` can trigger a personalized welcome response.
365 # The parameter `name` may be used by the agent in the response:
366 # `&quot;Hello #welcome_event.name! What can I do for you today?&quot;`.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700367 &quot;parameters&quot;: { # The collection of parameters associated with the event.
368 #
369 # Depending on your protocol or client library language, this is a
370 # map, associative array, symbol table, dictionary, or JSON object
371 # composed of a collection of (MapKey, MapValue) pairs:
372 #
373 # - MapKey type: string
374 # - MapKey value: parameter name
375 # - MapValue type:
376 # - If parameter&#x27;s entity type is a composite entity: map
377 # - Else: string or number, depending on parameter value type
378 # - MapValue value:
379 # - If parameter&#x27;s entity type is a composite entity:
380 # map from composite entity property names to property values
381 # - Else: parameter value
382 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
383 },
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700384 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the event.
385 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of this query. See [Language
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700386 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
387 # for a list of the currently supported language codes. Note that queries in
388 # the same session do not necessarily need to specify the same language.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700389 },
390 &quot;audioConfig&quot;: { # Instructs the speech recognizer how to process the audio content. # Instructs the speech recognizer how to process the speech audio.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700391 &quot;model&quot;: &quot;A String&quot;, # Which Speech model to select for the given request. Select the
392 # model best suited to your domain to get best results. If a model is not
393 # explicitly specified, then we auto-select a model based on the parameters
394 # in the InputAudioConfig.
395 # If enhanced speech model is enabled for the agent and an enhanced
396 # version of the specified model for the language does not exist, then the
397 # speech is recognized using the standard version of the specified model.
398 # Refer to
399 # [Cloud Speech API
400 # documentation](https://cloud.google.com/speech-to-text/docs/basics#select-model)
401 # for more details.
402 &quot;modelVariant&quot;: &quot;A String&quot;, # Which variant of the Speech model to use.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700403 &quot;speechContexts&quot;: [ # Context information to assist speech recognition.
404 #
405 # See [the Cloud Speech
406 # documentation](https://cloud.google.com/speech-to-text/docs/basics#phrase-hints)
407 # for more details.
408 { # Hints for the speech recognizer to help with recognition in a specific
409 # conversation state.
410 &quot;phrases&quot;: [ # Optional. A list of strings containing words and phrases that the speech
411 # recognizer should recognize with higher likelihood.
412 #
413 # This list can be used to:
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700414 #
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700415 # * improve accuracy for words and phrases you expect the user to say,
416 # e.g. typical commands for your Dialogflow agent
417 # * add additional words to the speech recognizer vocabulary
418 # * ...
419 #
420 # See the [Cloud Speech
421 # documentation](https://cloud.google.com/speech-to-text/quotas) for usage
422 # limits.
423 &quot;A String&quot;,
424 ],
425 &quot;boost&quot;: 3.14, # Optional. Boost for this context compared to other contexts:
426 #
427 # * If the boost is positive, Dialogflow will increase the probability that
428 # the phrases in this context are recognized over similar sounding phrases.
429 # * If the boost is unspecified or non-positive, Dialogflow will not apply
430 # any boost.
431 #
432 # Dialogflow recommends that you use boosts in the range (0, 20] and that you
433 # find a value that fits your use case with binary search.
434 },
435 ],
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700436 &quot;enableWordInfo&quot;: True or False, # If `true`, Dialogflow returns SpeechWordInfo in
437 # StreamingRecognitionResult with information about the recognized speech
438 # words, e.g. start and end time offsets. If false or unspecified, Speech
439 # doesn&#x27;t return any word-level information.
440 &quot;singleUtterance&quot;: True or False, # If `false` (default), recognition does not cease until the
441 # client closes the stream.
442 # If `true`, the recognizer will detect a single spoken utterance in input
443 # audio. Recognition ceases when it detects the audio&#x27;s voice has
444 # stopped or paused. In this case, once a detected intent is received, the
445 # client should close the stream and start a new request with a new stream as
446 # needed.
447 # Note: This setting is relevant only for streaming methods.
448 # Note: When specified, InputAudioConfig.single_utterance takes precedence
449 # over StreamingDetectIntentRequest.single_utterance.
450 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the audio content to process.
451 &quot;sampleRateHertz&quot;: 42, # Required. Sample rate (in Hertz) of the audio content sent in the query.
452 # Refer to
453 # [Cloud Speech API
454 # documentation](https://cloud.google.com/speech-to-text/docs/basics) for
455 # more details.
456 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of the supplied audio. Dialogflow does not do
457 # translations. See [Language
458 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
459 # for a list of the currently supported language codes. Note that queries in
460 # the same session do not necessarily need to specify the same language.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700461 &quot;phraseHints&quot;: [ # A list of strings containing words and phrases that the speech
462 # recognizer should recognize with higher likelihood.
463 #
464 # See [the Cloud Speech
465 # documentation](https://cloud.google.com/speech-to-text/docs/basics#phrase-hints)
466 # for more details.
467 #
468 # This field is deprecated. Please use [speech_contexts]() instead. If you
469 # specify both [phrase_hints]() and [speech_contexts](), Dialogflow will
470 # treat the [phrase_hints]() as a single additional [SpeechContext]().
471 &quot;A String&quot;,
472 ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700473 },
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700474 },
475 }
476
477 x__xgafv: string, V1 error format.
478 Allowed values
479 1 - v1 error format
480 2 - v2 error format
481
482Returns:
483 An object of the form:
484
485 { # The message returned from the DetectIntent method.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700486 &quot;outputAudio&quot;: &quot;A String&quot;, # The audio data bytes encoded as specified in the request.
487 # Note: The output audio is generated based on the values of default platform
488 # text responses found in the `query_result.fulfillment_messages` field. If
489 # multiple default text responses exist, they will be concatenated when
490 # generating audio. If no default platform text responses exist, the
491 # generated audio content will be empty.
492 #
493 # In some scenarios, multiple output audio fields may be present in the
494 # response structure. In these cases, only the top-most-level audio output
495 # has content.
496 &quot;responseId&quot;: &quot;A String&quot;, # The unique identifier of the response. It can be used to
497 # locate a response in the training example set or for reporting issues.
Bu Sun Kim65020912020-05-20 12:08:20 -0700498 &quot;outputAudioConfig&quot;: { # Instructs the speech synthesizer on how to generate the output audio content. # The config used by the speech synthesizer to generate the output audio.
Dan O'Mearadd494642020-05-01 07:42:23 -0700499 # If this audio config is supplied in a request, it overrides all existing
500 # text-to-speech settings applied to the agent.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700501 &quot;sampleRateHertz&quot;: 42, # The synthesis sample rate (in hertz) for this audio. If not
502 # provided, then the synthesizer will use the default sample rate based on
503 # the audio encoding. If this is different from the voice&#x27;s natural sample
504 # rate, then the synthesizer will honor this request by converting to the
505 # desired sample rate (which might result in worse audio quality).
Bu Sun Kim65020912020-05-20 12:08:20 -0700506 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the synthesized audio content.
507 &quot;synthesizeSpeechConfig&quot;: { # Configuration of how speech should be synthesized. # Configuration of how speech should be synthesized.
Bu Sun Kim65020912020-05-20 12:08:20 -0700508 &quot;volumeGainDb&quot;: 3.14, # Optional. Volume gain (in dB) of the normal native volume supported by the
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700509 # specific voice, in the range [-96.0, 16.0]. If unset, or set to a value of
510 # 0.0 (dB), will play at normal native signal amplitude. A value of -6.0 (dB)
511 # will play at approximately half the amplitude of the normal native signal
512 # amplitude. A value of +6.0 (dB) will play at approximately twice the
513 # amplitude of the normal native signal amplitude. We strongly recommend not
Bu Sun Kim65020912020-05-20 12:08:20 -0700514 # to exceed +10 (dB) as there&#x27;s usually no effective increase in loudness for
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700515 # any value greater than that.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700516 &quot;speakingRate&quot;: 3.14, # Optional. Speaking rate/speed, in the range [0.25, 4.0]. 1.0 is the normal
517 # native speed supported by the specific voice. 2.0 is twice as fast, and
518 # 0.5 is half as fast. If unset(0.0), defaults to the native 1.0 speed. Any
519 # other values &lt; 0.25 or &gt; 4.0 will return an error.
Bu Sun Kim65020912020-05-20 12:08:20 -0700520 &quot;pitch&quot;: 3.14, # Optional. Speaking pitch, in the range [-20.0, 20.0]. 20 means increase 20
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700521 # semitones from the original pitch. -20 means decrease 20 semitones from the
522 # original pitch.
Bu Sun Kim65020912020-05-20 12:08:20 -0700523 &quot;voice&quot;: { # Description of which voice to use for speech synthesis. # Optional. The desired voice of the synthesized audio.
524 &quot;name&quot;: &quot;A String&quot;, # Optional. The name of the voice. If not set, the service will choose a
525 # voice based on the other parameters such as language_code and
526 # ssml_gender.
527 &quot;ssmlGender&quot;: &quot;A String&quot;, # Optional. The preferred gender of the voice. If not set, the service will
528 # choose a voice based on the other parameters such as language_code and
529 # name. Note that this is only a preference, not requirement. If a
530 # voice of the appropriate gender is not available, the synthesizer should
531 # substitute a voice with a different gender rather than failing the request.
532 },
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700533 &quot;effectsProfileId&quot;: [ # Optional. An identifier which selects &#x27;audio effects&#x27; profiles that are
534 # applied on (post synthesized) text to speech. Effects are applied on top of
535 # each other in the order they are given.
536 &quot;A String&quot;,
537 ],
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700538 },
539 },
Bu Sun Kim65020912020-05-20 12:08:20 -0700540 &quot;queryResult&quot;: { # Represents the result of conversational query or event processing. # The selected results of the conversational query or event processing.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700541 # See `alternative_query_results` for additional potential results.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700542 &quot;intent&quot;: { # An intent categorizes an end-user&#x27;s intention for one conversation turn. For # The intent that matched the conversational query. Some, not
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700543 # all fields are filled in this message, including but not limited to:
Dan O'Mearadd494642020-05-01 07:42:23 -0700544 # `name`, `display_name`, `end_interaction` and `is_fallback`.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700545 # each agent, you define many intents, where your combined intents can handle a
546 # complete conversation. When an end-user writes or says something, referred to
547 # as an end-user expression or end-user input, Dialogflow matches the end-user
548 # input to the best intent in your agent. Matching an intent is also known as
549 # intent classification.
550 #
551 # For more information, see the [intent
552 # guide](https://cloud.google.com/dialogflow/docs/intents-overview).
553 &quot;name&quot;: &quot;A String&quot;, # Optional. The unique identifier of this intent.
554 # Required for Intents.UpdateIntent and Intents.BatchUpdateIntents
555 # methods.
556 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
557 &quot;webhookState&quot;: &quot;A String&quot;, # Optional. Indicates whether webhooks are enabled for the intent.
558 &quot;isFallback&quot;: True or False, # Optional. Indicates whether this is a fallback intent.
559 &quot;displayName&quot;: &quot;A String&quot;, # Required. The name of this intent.
560 &quot;messages&quot;: [ # Optional. The collection of rich messages corresponding to the
561 # `Response` field in the Dialogflow console.
562 { # A rich response message.
563 # Corresponds to the intent `Response` field in the Dialogflow console.
564 # For more information, see
565 # [Rich response
566 # messages](https://cloud.google.com/dialogflow/docs/intents-rich-messages).
567 &quot;card&quot;: { # The card response message. # The card response.
568 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
569 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
570 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
571 { # Contains information about a button.
572 &quot;text&quot;: &quot;A String&quot;, # Optional. The text to show on the button.
573 &quot;postback&quot;: &quot;A String&quot;, # Optional. The text to send back to the Dialogflow API or a URI to
574 # open.
575 },
576 ],
577 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file for the card.
578 },
579 &quot;text&quot;: { # The text response message. # The text response.
580 &quot;text&quot;: [ # Optional. The collection of the agent&#x27;s responses.
581 &quot;A String&quot;,
582 ],
583 },
584 &quot;carouselSelect&quot;: { # The card for presenting a carousel of options to select from. # The carousel card response for Actions on Google.
585 &quot;items&quot;: [ # Required. Carousel items.
586 { # An item in the carousel.
587 &quot;description&quot;: &quot;A String&quot;, # Optional. The body text of the card.
588 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item.
589 &quot;image&quot;: { # The image response message. # Optional. The image to display.
590 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
591 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
592 # e.g., screen readers.
593 },
594 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional info about the option item.
595 # dialog.
596 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
597 # item in dialog.
598 &quot;A String&quot;,
599 ],
600 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
601 # response is given.
602 },
603 },
604 ],
605 },
606 &quot;simpleResponses&quot;: { # The collection of simple response candidates. # The voice and text-only responses for Actions on Google.
607 # This message in `QueryResult.fulfillment_messages` and
608 # `WebhookResponse.fulfillment_messages` should contain only one
609 # `SimpleResponse`.
610 &quot;simpleResponses&quot;: [ # Required. The list of simple responses.
611 { # The simple response message containing speech or text.
612 &quot;textToSpeech&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. The plain text of the
613 # speech output. Mutually exclusive with ssml.
614 &quot;ssml&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. Structured spoken
615 # response to the user in the SSML format. Mutually exclusive with
616 # text_to_speech.
617 &quot;displayText&quot;: &quot;A String&quot;, # Optional. The text to display.
618 },
619 ],
620 },
621 &quot;platform&quot;: &quot;A String&quot;, # Optional. The platform that this message is intended for.
622 &quot;browseCarouselCard&quot;: { # Browse Carousel Card for Actions on Google. # Browse carousel card for Actions on Google.
623 # https://developers.google.com/actions/assistant/responses#browsing_carousel
624 &quot;items&quot;: [ # Required. List of items in the Browse Carousel Card. Minimum of two
625 # items, maximum of ten.
626 { # Browsing carousel tile
627 &quot;footer&quot;: &quot;A String&quot;, # Optional. Text that appears at the bottom of the Browse Carousel
628 # Card. Maximum of one line of text.
629 &quot;image&quot;: { # The image response message. # Optional. Hero image for the carousel item.
630 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
631 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
632 # e.g., screen readers.
633 },
634 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of the carousel item. Maximum of four lines of
635 # text.
636 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item. Maximum of two lines of text.
637 &quot;openUriAction&quot;: { # Actions on Google action to open a given url. # Required. Action to present to the user.
638 &quot;url&quot;: &quot;A String&quot;, # Required. URL
639 &quot;urlTypeHint&quot;: &quot;A String&quot;, # Optional. Specifies the type of viewer that is used when opening
640 # the URL. Defaults to opening via web browser.
641 },
642 },
643 ],
644 &quot;imageDisplayOptions&quot;: &quot;A String&quot;, # Optional. Settings for displaying the image. Applies to every image in
645 # items.
646 },
647 &quot;linkOutSuggestion&quot;: { # The suggestion chip message that allows the user to jump out to the app # The link out suggestion chip for Actions on Google.
648 # or website associated with this agent.
649 &quot;uri&quot;: &quot;A String&quot;, # Required. The URI of the app or site to open when the user taps the
650 # suggestion chip.
651 &quot;destinationName&quot;: &quot;A String&quot;, # Required. The name of the app or site this chip is linking to.
652 },
653 &quot;basicCard&quot;: { # The basic card message. Useful for displaying information. # The basic card response for Actions on Google.
654 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
655 { # The button object that appears at the bottom of a card.
656 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
657 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
658 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
659 },
660 },
661 ],
662 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
663 &quot;formattedText&quot;: &quot;A String&quot;, # Required, unless image is present. The body text of the card.
664 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
665 &quot;image&quot;: { # The image response message. # Optional. The image for the card.
666 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
667 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
668 # e.g., screen readers.
669 },
670 },
671 &quot;suggestions&quot;: { # The collection of suggestions. # The suggestion chips for Actions on Google.
672 &quot;suggestions&quot;: [ # Required. The list of suggested replies.
673 { # The suggestion chip message that the user can tap to quickly post a reply
674 # to the conversation.
675 &quot;title&quot;: &quot;A String&quot;, # Required. The text shown the in the suggestion chip.
676 },
677 ],
678 },
679 &quot;quickReplies&quot;: { # The quick replies response message. # The quick replies response.
680 &quot;quickReplies&quot;: [ # Optional. The collection of quick replies.
681 &quot;A String&quot;,
682 ],
683 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the collection of quick replies.
684 },
685 &quot;tableCard&quot;: { # Table card for Actions on Google. # Table card for Actions on Google.
686 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the card.
687 &quot;columnProperties&quot;: [ # Optional. Display properties for the columns in this table.
688 { # Column properties for TableCard.
689 &quot;header&quot;: &quot;A String&quot;, # Required. Column heading.
690 &quot;horizontalAlignment&quot;: &quot;A String&quot;, # Optional. Defines text alignment for all cells in this column.
691 },
692 ],
693 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle to the title.
694 &quot;image&quot;: { # The image response message. # Optional. Image which should be displayed on the card.
695 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
696 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
697 # e.g., screen readers.
698 },
699 &quot;rows&quot;: [ # Optional. Rows in this table of data.
700 { # Row of TableCard.
701 &quot;cells&quot;: [ # Optional. List of cells that make up this row.
702 { # Cell of TableCardRow.
703 &quot;text&quot;: &quot;A String&quot;, # Required. Text in this cell.
704 },
705 ],
706 &quot;dividerAfter&quot;: True or False, # Optional. Whether to add a visual divider after this row.
707 },
708 ],
709 &quot;buttons&quot;: [ # Optional. List of buttons for the card.
710 { # The button object that appears at the bottom of a card.
711 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
712 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
713 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
714 },
715 },
716 ],
717 },
718 &quot;image&quot;: { # The image response message. # The image response.
719 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
720 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
721 # e.g., screen readers.
722 },
723 &quot;mediaContent&quot;: { # The media content card for Actions on Google. # The media content card for Actions on Google.
724 &quot;mediaObjects&quot;: [ # Required. List of media objects.
725 { # Response media object for media content card.
726 &quot;largeImage&quot;: { # The image response message. # Optional. Image to display above media content.
727 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
728 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
729 # e.g., screen readers.
730 },
731 &quot;contentUrl&quot;: &quot;A String&quot;, # Required. Url where the media is stored.
732 &quot;icon&quot;: { # The image response message. # Optional. Icon to display above media content.
733 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
734 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
735 # e.g., screen readers.
736 },
737 &quot;name&quot;: &quot;A String&quot;, # Required. Name of media card.
738 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of media card.
739 },
740 ],
741 &quot;mediaType&quot;: &quot;A String&quot;, # Optional. What type of media is the content (ie &quot;audio&quot;).
742 },
743 &quot;listSelect&quot;: { # The card for presenting a list of options to select from. # The list card response for Actions on Google.
744 &quot;title&quot;: &quot;A String&quot;, # Optional. The overall title of the list.
745 &quot;items&quot;: [ # Required. List items.
746 { # An item in the list.
747 &quot;image&quot;: { # The image response message. # Optional. The image to display.
748 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
749 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
750 # e.g., screen readers.
751 },
752 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional information about this option.
753 # dialog.
754 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
755 # item in dialog.
756 &quot;A String&quot;,
757 ],
758 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
759 # response is given.
760 },
761 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the list item.
762 &quot;description&quot;: &quot;A String&quot;, # Optional. The main text describing the item.
763 },
764 ],
765 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle of the list.
766 },
767 &quot;payload&quot;: { # A custom platform-specific response.
768 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
769 },
770 },
771 ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700772 &quot;events&quot;: [ # Optional. The collection of event names that trigger the intent.
773 # If the collection of input contexts is not empty, all of the contexts must
774 # be present in the active user session for an event to trigger this intent.
775 # Event names are limited to 150 characters.
776 &quot;A String&quot;,
777 ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700778 &quot;outputContexts&quot;: [ # Optional. The collection of contexts that are activated when the intent
779 # is matched. Context messages in this collection should not set the
780 # parameters field. Setting the `lifespan_count` to 0 will reset the context
781 # when the intent is matched.
782 # Format: `projects/&lt;Project ID&gt;/agent/sessions/-/contexts/&lt;Context ID&gt;`.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700783 { # Dialogflow contexts are similar to natural language context. If a person says
784 # to you &quot;they are orange&quot;, you need context in order to understand what &quot;they&quot;
785 # is referring to. Similarly, for Dialogflow to handle an end-user expression
786 # like that, it needs to be provided with context in order to correctly match
787 # an intent.
788 #
789 # Using contexts, you can control the flow of a conversation. You can configure
790 # contexts for an intent by setting input and output contexts, which are
791 # identified by string names. When an intent is matched, any configured output
792 # contexts for that intent become active. While any contexts are active,
793 # Dialogflow is more likely to match intents that are configured with input
794 # contexts that correspond to the currently active contexts.
795 #
796 # For more information about context, see the
797 # [Contexts guide](https://cloud.google.com/dialogflow/docs/contexts-overview).
Bu Sun Kim65020912020-05-20 12:08:20 -0700798 &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
799 #
800 # Depending on your protocol or client library language, this is a
801 # map, associative array, symbol table, dictionary, or JSON object
802 # composed of a collection of (MapKey, MapValue) pairs:
803 #
804 # - MapKey type: string
805 # - MapKey value: parameter name
806 # - MapValue type:
807 # - If parameter&#x27;s entity type is a composite entity: map
808 # - Else: string or number, depending on parameter value type
809 # - MapValue value:
810 # - If parameter&#x27;s entity type is a composite entity:
811 # map from composite entity property names to property values
812 # - Else: parameter value
813 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
814 },
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700815 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
816 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
817 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
818 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
819 #
820 # The `Context ID` is always converted to lowercase, may only contain
821 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
822 #
823 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
824 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
825 #
826 # The following context names are reserved for internal use by Dialogflow.
827 # You should not use these contexts or create contexts with these names:
828 #
829 # * `__system_counters__`
830 # * `*_id_dialog_context`
831 # * `*_dialog_params_size`
832 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
833 # context expires. The default is `0`. If set to `0`, the context expires
834 # immediately. Contexts expire automatically after 20 minutes if there
835 # are no matching queries.
Bu Sun Kim65020912020-05-20 12:08:20 -0700836 },
837 ],
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700838 &quot;action&quot;: &quot;A String&quot;, # Optional. The name of the action associated with the intent.
839 # Note: The action name must not contain whitespaces.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700840 &quot;priority&quot;: 42, # Optional. The priority of this intent. Higher numbers represent higher
841 # priorities.
842 #
843 # - If the supplied value is unspecified or 0, the service
844 # translates the value to 500,000, which corresponds to the
845 # `Normal` priority in the console.
846 # - If the supplied value is negative, the intent is ignored
847 # in runtime detect intent requests.
Bu Sun Kim65020912020-05-20 12:08:20 -0700848 &quot;rootFollowupIntentName&quot;: &quot;A String&quot;, # Read-only. The unique identifier of the root intent in the chain of
849 # followup intents. It identifies the correct followup intents chain for
850 # this intent. We populate this field only in the output.
851 #
852 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700853 &quot;followupIntentInfo&quot;: [ # Read-only. Information about all followup intents that have this intent as
854 # a direct or indirect parent. We populate this field only in the output.
855 { # Represents a single followup intent in the chain.
856 &quot;parentFollowupIntentName&quot;: &quot;A String&quot;, # The unique identifier of the followup intent&#x27;s parent.
857 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
858 &quot;followupIntentName&quot;: &quot;A String&quot;, # The unique identifier of the followup intent.
859 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
860 },
861 ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700862 &quot;trainingPhrases&quot;: [ # Optional. The collection of examples that the agent is
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700863 # trained on.
864 { # Represents an example that the agent is trained on.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700865 &quot;name&quot;: &quot;A String&quot;, # Output only. The unique identifier of this training phrase.
866 &quot;timesAddedCount&quot;: 42, # Optional. Indicates how many times this example was added to
867 # the intent. Each time a developer adds an existing sample by editing an
868 # intent or training, this counter is increased.
Bu Sun Kim65020912020-05-20 12:08:20 -0700869 &quot;parts&quot;: [ # Required. The ordered list of training phrase parts.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700870 # The parts are concatenated in order to form the training phrase.
871 #
872 # Note: The API does not automatically annotate training phrases like the
873 # Dialogflow Console does.
874 #
875 # Note: Do not forget to include whitespace at part boundaries,
876 # so the training phrase is well formatted when the parts are concatenated.
877 #
878 # If the training phrase does not need to be annotated with parameters,
879 # you just need a single part with only the Part.text field set.
880 #
881 # If you want to annotate the training phrase, you must create multiple
882 # parts, where the fields of each part are populated in one of two ways:
883 #
884 # - `Part.text` is set to a part of the phrase that has no parameters.
885 # - `Part.text` is set to a part of the phrase that you want to annotate,
886 # and the `entity_type`, `alias`, and `user_defined` fields are all
887 # set.
888 { # Represents a part of a training phrase.
Bu Sun Kim65020912020-05-20 12:08:20 -0700889 &quot;alias&quot;: &quot;A String&quot;, # Optional. The parameter name for the value extracted from the
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700890 # annotated part of the example.
891 # This field is required for annotated parts of the training phrase.
Bu Sun Kim65020912020-05-20 12:08:20 -0700892 &quot;userDefined&quot;: True or False, # Optional. Indicates whether the text was manually annotated.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700893 # This field is set to true when the Dialogflow Console is used to
894 # manually annotate the part. When creating an annotated part with the
895 # API, you must set this to true.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700896 &quot;text&quot;: &quot;A String&quot;, # Required. The text for this part.
897 &quot;entityType&quot;: &quot;A String&quot;, # Optional. The entity type name prefixed with `@`.
898 # This field is required for annotated parts of the training phrase.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700899 },
900 ],
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700901 &quot;type&quot;: &quot;A String&quot;, # Required. The type of the training phrase.
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700902 },
903 ],
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700904 &quot;inputContextNames&quot;: [ # Optional. The list of context names required for this intent to be
905 # triggered.
906 # Format: `projects/&lt;Project ID&gt;/agent/sessions/-/contexts/&lt;Context ID&gt;`.
907 &quot;A String&quot;,
908 ],
909 &quot;mlDisabled&quot;: True or False, # Optional. Indicates whether Machine Learning is disabled for the intent.
910 # Note: If `ml_disabled` setting is set to true, then this intent is not
911 # taken into account during inference in `ML ONLY` match mode. Also,
912 # auto-markup in the UI is turned off.
Bu Sun Kim65020912020-05-20 12:08:20 -0700913 &quot;resetContexts&quot;: True or False, # Optional. Indicates whether to delete all contexts in the current
Dan O'Mearadd494642020-05-01 07:42:23 -0700914 # session when this intent is matched.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700915 &quot;parentFollowupIntentName&quot;: &quot;A String&quot;, # Read-only after creation. The unique identifier of the parent intent in the
916 # chain of followup intents. You can set this field when creating an intent,
917 # for example with CreateIntent or
918 # BatchUpdateIntents, in order to make this
919 # intent a followup intent.
920 #
921 # It identifies the parent followup intent.
922 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
923 &quot;defaultResponsePlatforms&quot;: [ # Optional. The list of platforms for which the first responses will be
924 # copied from the messages in PLATFORM_UNSPECIFIED (i.e. default platform).
925 &quot;A String&quot;,
926 ],
Bu Sun Kim65020912020-05-20 12:08:20 -0700927 &quot;parameters&quot;: [ # Optional. The collection of parameters associated with the intent.
928 { # Represents intent parameters.
Bu Sun Kim65020912020-05-20 12:08:20 -0700929 &quot;value&quot;: &quot;A String&quot;, # Optional. The definition of the parameter value. It can be:
Bu Sun Kim715bd7f2019-06-14 16:50:42 -0700930 #
Bu Sun Kim65020912020-05-20 12:08:20 -0700931 # - a constant string,
932 # - a parameter value defined as `$parameter_name`,
933 # - an original parameter value defined as `$parameter_name.original`,
934 # - a parameter value from some context defined as
935 # `#context_name.parameter_name`.
936 &quot;displayName&quot;: &quot;A String&quot;, # Required. The name of the parameter.
Bu Sun Kim4ed7d3f2020-05-27 12:20:54 -0700937 &quot;mandatory&quot;: True or False, # Optional. Indicates whether the parameter is required. That is,
938 # whether the intent cannot be completed without collecting the parameter
939 # value.
Bu Sun Kimd059ad82020-07-22 17:02:09 -0700940 &quot;isList&quot;: True or False, # Optional. Indicates whether the parameter represents a list of values.
941 &quot;entityTypeDisplayName&quot;: &quot;A String&quot;, # Optional. The name of the entity type, prefixed with `@`, that
942 # describes values of the parameter. If the parameter is
943 # required, this must be provided.
944       &quot;defaultValue&quot;: &quot;A String&quot;, # Optional. The default value to use when the `value` yields an empty
945 # result.
946 # Default values can be extracted from contexts by using the following
947 # syntax: `#context_name.parameter_name`.
948 &quot;name&quot;: &quot;A String&quot;, # The unique identifier of this parameter.
949       &quot;prompts&quot;: [ # Optional. The collection of prompts that the agent can present to the
950 # user in order to collect a value for the parameter.
951 &quot;A String&quot;,
952 ],
953       },
954 ],
955     },
956     &quot;languageCode&quot;: &quot;A String&quot;, # The language that was triggered during intent detection.
957 # See [Language
958 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
959 # for a list of the currently supported language codes.
960 &quot;outputContexts&quot;: [ # The collection of output contexts. If applicable,
961         # `output_contexts.parameters` contains entries with name
962         # `&lt;parameter name&gt;.original` containing the original parameter values
963         # before the query.
964     { # Dialogflow contexts are similar to natural language context. If a person says
965 # to you &quot;they are orange&quot;, you need context in order to understand what &quot;they&quot;
966 # is referring to. Similarly, for Dialogflow to handle an end-user expression
967 # like that, it needs to be provided with context in order to correctly match
968 # an intent.
969 #
970 # Using contexts, you can control the flow of a conversation. You can configure
971 # contexts for an intent by setting input and output contexts, which are
972 # identified by string names. When an intent is matched, any configured output
973 # contexts for that intent become active. While any contexts are active,
974 # Dialogflow is more likely to match intents that are configured with input
975 # contexts that correspond to the currently active contexts.
976 #
977 # For more information about context, see the
978 # [Contexts guide](https://cloud.google.com/dialogflow/docs/contexts-overview).
979       &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
980           #
981 # Depending on your protocol or client library language, this is a
982 # map, associative array, symbol table, dictionary, or JSON object
983 # composed of a collection of (MapKey, MapValue) pairs:
984 #
985 # - MapKey type: string
986 # - MapKey value: parameter name
987 # - MapValue type:
988           # - If parameter&#x27;s entity type is a composite entity: map
989           # - Else: string or number, depending on parameter value type
990 # - MapValue value:
991           # - If parameter&#x27;s entity type is a composite entity:
992           # map from composite entity property names to property values
993 # - Else: parameter value
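            # For example, a composite entity parameter might be rendered in this map
            # as follows (a hypothetical illustration, not API output):
            #   { &quot;pizza_order&quot;: { &quot;size&quot;: &quot;large&quot;, &quot;topping&quot;: &quot;mushrooms&quot; } }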
994         &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
995       },
996       &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
997 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
998 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
999 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
1000 #
1001 # The `Context ID` is always converted to lowercase, may only contain
1002 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
1003 #
1004 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
1005 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
1006 #
1007 # The following context names are reserved for internal use by Dialogflow.
1008 # You should not use these contexts or create contexts with these names:
1009 #
1010 # * `__system_counters__`
1011 # * `*_id_dialog_context`
1012 # * `*_dialog_params_size`
1013 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
1014 # context expires. The default is `0`. If set to `0`, the context expires
1015 # immediately. Contexts expire automatically after 20 minutes if there
1016 # are no matching queries.
1017     },
1018 ],
1019     &quot;webhookPayload&quot;: { # If the query was fulfilled by a webhook call, this field is set to the
1020 # value of the `payload` field returned in the webhook response.
1021 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1022 },
1023 &quot;action&quot;: &quot;A String&quot;, # The action name from the matched intent.
1024 &quot;fulfillmentMessages&quot;: [ # The collection of rich messages to present to the user.
1025 { # A rich response message.
1026 # Corresponds to the intent `Response` field in the Dialogflow console.
1027 # For more information, see
1028 # [Rich response
1029 # messages](https://cloud.google.com/dialogflow/docs/intents-rich-messages).
1030 &quot;card&quot;: { # The card response message. # The card response.
1031 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
1032 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
1033 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
1034 { # Contains information about a button.
1035 &quot;text&quot;: &quot;A String&quot;, # Optional. The text to show on the button.
1036 &quot;postback&quot;: &quot;A String&quot;, # Optional. The text to send back to the Dialogflow API or a URI to
1037 # open.
1038 },
1039 ],
1040 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file for the card.
1041 },
1042 &quot;text&quot;: { # The text response message. # The text response.
1043 &quot;text&quot;: [ # Optional. The collection of the agent&#x27;s responses.
1044 &quot;A String&quot;,
1045 ],
1046 },
1047 &quot;carouselSelect&quot;: { # The card for presenting a carousel of options to select from. # The carousel card response for Actions on Google.
1048 &quot;items&quot;: [ # Required. Carousel items.
1049 { # An item in the carousel.
1050 &quot;description&quot;: &quot;A String&quot;, # Optional. The body text of the card.
1051 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item.
1052 &quot;image&quot;: { # The image response message. # Optional. The image to display.
1053 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1054 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1055 # e.g., screen readers.
1056 },
1057 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional info about the option item.
1058 # dialog.
1059 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
1060 # item in dialog.
1061 &quot;A String&quot;,
1062 ],
1063 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
1064 # response is given.
1065 },
1066 },
1067 ],
1068 },
1069 &quot;simpleResponses&quot;: { # The collection of simple response candidates. # The voice and text-only responses for Actions on Google.
1070 # This message in `QueryResult.fulfillment_messages` and
1071 # `WebhookResponse.fulfillment_messages` should contain only one
1072 # `SimpleResponse`.
1073 &quot;simpleResponses&quot;: [ # Required. The list of simple responses.
1074 { # The simple response message containing speech or text.
1075 &quot;textToSpeech&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. The plain text of the
1076 # speech output. Mutually exclusive with ssml.
1077 &quot;ssml&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. Structured spoken
1078 # response to the user in the SSML format. Mutually exclusive with
1079 # text_to_speech.
1080 &quot;displayText&quot;: &quot;A String&quot;, # Optional. The text to display.
1081 },
1082 ],
1083 },
1084 &quot;platform&quot;: &quot;A String&quot;, # Optional. The platform that this message is intended for.
1085 &quot;browseCarouselCard&quot;: { # Browse Carousel Card for Actions on Google. # Browse carousel card for Actions on Google.
1086 # https://developers.google.com/actions/assistant/responses#browsing_carousel
1087 &quot;items&quot;: [ # Required. List of items in the Browse Carousel Card. Minimum of two
1088 # items, maximum of ten.
1089 { # Browsing carousel tile
1090 &quot;footer&quot;: &quot;A String&quot;, # Optional. Text that appears at the bottom of the Browse Carousel
1091 # Card. Maximum of one line of text.
1092 &quot;image&quot;: { # The image response message. # Optional. Hero image for the carousel item.
1093 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1094 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1095 # e.g., screen readers.
1096 },
1097 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of the carousel item. Maximum of four lines of
1098 # text.
1099 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item. Maximum of two lines of text.
1100 &quot;openUriAction&quot;: { # Actions on Google action to open a given url. # Required. Action to present to the user.
1101 &quot;url&quot;: &quot;A String&quot;, # Required. URL
1102 &quot;urlTypeHint&quot;: &quot;A String&quot;, # Optional. Specifies the type of viewer that is used when opening
1103 # the URL. Defaults to opening via web browser.
1104 },
1105 },
1106 ],
1107 &quot;imageDisplayOptions&quot;: &quot;A String&quot;, # Optional. Settings for displaying the image. Applies to every image in
1108 # items.
1109 },
1110 &quot;linkOutSuggestion&quot;: { # The suggestion chip message that allows the user to jump out to the app # The link out suggestion chip for Actions on Google.
1111 # or website associated with this agent.
1112 &quot;uri&quot;: &quot;A String&quot;, # Required. The URI of the app or site to open when the user taps the
1113 # suggestion chip.
1114 &quot;destinationName&quot;: &quot;A String&quot;, # Required. The name of the app or site this chip is linking to.
1115 },
1116 &quot;basicCard&quot;: { # The basic card message. Useful for displaying information. # The basic card response for Actions on Google.
1117 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
1118 { # The button object that appears at the bottom of a card.
1119 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
1120 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
1121 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
1122 },
1123 },
1124 ],
1125 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
1126 &quot;formattedText&quot;: &quot;A String&quot;, # Required, unless image is present. The body text of the card.
1127 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
1128 &quot;image&quot;: { # The image response message. # Optional. The image for the card.
1129 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1130 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1131 # e.g., screen readers.
1132 },
1133 },
1134 &quot;suggestions&quot;: { # The collection of suggestions. # The suggestion chips for Actions on Google.
1135 &quot;suggestions&quot;: [ # Required. The list of suggested replies.
1136 { # The suggestion chip message that the user can tap to quickly post a reply
1137 # to the conversation.
1138           &quot;title&quot;: &quot;A String&quot;, # Required. The text shown in the suggestion chip.
1139 },
1140 ],
1141 },
1142 &quot;quickReplies&quot;: { # The quick replies response message. # The quick replies response.
1143 &quot;quickReplies&quot;: [ # Optional. The collection of quick replies.
1144 &quot;A String&quot;,
1145 ],
1146 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the collection of quick replies.
1147 },
1148 &quot;tableCard&quot;: { # Table card for Actions on Google. # Table card for Actions on Google.
1149 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the card.
1150 &quot;columnProperties&quot;: [ # Optional. Display properties for the columns in this table.
1151 { # Column properties for TableCard.
1152 &quot;header&quot;: &quot;A String&quot;, # Required. Column heading.
1153 &quot;horizontalAlignment&quot;: &quot;A String&quot;, # Optional. Defines text alignment for all cells in this column.
1154 },
1155 ],
1156 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle to the title.
1157 &quot;image&quot;: { # The image response message. # Optional. Image which should be displayed on the card.
1158 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1159 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1160 # e.g., screen readers.
1161 },
1162 &quot;rows&quot;: [ # Optional. Rows in this table of data.
1163 { # Row of TableCard.
1164 &quot;cells&quot;: [ # Optional. List of cells that make up this row.
1165 { # Cell of TableCardRow.
1166 &quot;text&quot;: &quot;A String&quot;, # Required. Text in this cell.
1167 },
1168 ],
1169 &quot;dividerAfter&quot;: True or False, # Optional. Whether to add a visual divider after this row.
1170 },
1171 ],
1172 &quot;buttons&quot;: [ # Optional. List of buttons for the card.
1173 { # The button object that appears at the bottom of a card.
1174 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
1175 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
1176 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
1177 },
1178 },
1179 ],
1180 },
1181 &quot;image&quot;: { # The image response message. # The image response.
1182 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1183 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1184 # e.g., screen readers.
1185 },
1186 &quot;mediaContent&quot;: { # The media content card for Actions on Google. # The media content card for Actions on Google.
1187 &quot;mediaObjects&quot;: [ # Required. List of media objects.
1188 { # Response media object for media content card.
1189 &quot;largeImage&quot;: { # The image response message. # Optional. Image to display above media content.
1190 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1191 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1192 # e.g., screen readers.
1193 },
1194             &quot;contentUrl&quot;: &quot;A String&quot;, # Required. URL where the media is stored.
1195 &quot;icon&quot;: { # The image response message. # Optional. Icon to display above media content.
1196 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1197 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1198 # e.g., screen readers.
1199 },
1200 &quot;name&quot;: &quot;A String&quot;, # Required. Name of media card.
1201 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of media card.
1202 },
1203 ],
1204         &quot;mediaType&quot;: &quot;A String&quot;, # Optional. The type of media for the content (for example, &quot;audio&quot;).
1205 },
1206 &quot;listSelect&quot;: { # The card for presenting a list of options to select from. # The list card response for Actions on Google.
1207 &quot;title&quot;: &quot;A String&quot;, # Optional. The overall title of the list.
1208 &quot;items&quot;: [ # Required. List items.
1209 { # An item in the list.
1210 &quot;image&quot;: { # The image response message. # Optional. The image to display.
1211 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1212 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1213 # e.g., screen readers.
1214 },
1215 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional information about this option.
1216 # dialog.
1217 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
1218 # item in dialog.
1219 &quot;A String&quot;,
1220 ],
1221 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
1222 # response is given.
1223 },
1224 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the list item.
1225 &quot;description&quot;: &quot;A String&quot;, # Optional. The main text describing the item.
1226 },
1227 ],
1228 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle of the list.
1229 },
1230 &quot;payload&quot;: { # A custom platform-specific response.
1231 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1232 },
1233 },
1234 ],
1235 &quot;webhookSource&quot;: &quot;A String&quot;, # If the query was fulfilled by a webhook call, this field is set to the
1236 # value of the `source` field returned in the webhook response.
1237 &quot;allRequiredParamsPresent&quot;: True or False, # This field is set to:
1238 #
1239 # - `false` if the matched intent has required parameters and not all of
1240 # the required parameter values have been collected.
1241 # - `true` if all required parameter values have been collected, or if the
1242 # matched intent doesn&#x27;t contain any required parameters.
1243     &quot;speechRecognitionConfidence&quot;: 3.14, # The speech recognition confidence, between 0.0 and 1.0. A higher number
1244 # indicates an estimated greater likelihood that the recognized words are
1245 # correct. The default of 0.0 is a sentinel value indicating that confidence
1246 # was not set.
1247 #
1248 # This field is not guaranteed to be accurate or set. In particular this
1249 # field isn&#x27;t set for StreamingDetectIntent since the streaming endpoint has
1250 # separate confidence estimates per portion of the audio in
1251 # StreamingRecognitionResult.
1252 &quot;fulfillmentText&quot;: &quot;A String&quot;, # The text to be pronounced to the user or shown on the screen.
1253         # Note: This is a legacy field; `fulfillment_messages` should be preferred.
1254 &quot;sentimentAnalysisResult&quot;: { # The result of sentiment analysis. Sentiment analysis inspects user input # The sentiment analysis result, which depends on the
1255 # `sentiment_analysis_request_config` specified in the request.
1256 # and identifies the prevailing subjective opinion, especially to determine a
1257 # user&#x27;s attitude as positive, negative, or neutral.
1258 # For Participants.AnalyzeContent, it needs to be configured in
1259 # DetectIntentRequest.query_params. For
1260 # Participants.StreamingAnalyzeContent, it needs to be configured in
1261 # StreamingDetectIntentRequest.query_params.
1262 # And for Participants.AnalyzeContent and
1263 # Participants.StreamingAnalyzeContent, it needs to be configured in
1264 # ConversationProfile.human_agent_assistant_config
1265 &quot;queryTextSentiment&quot;: { # The sentiment, such as positive/negative feeling or association, for a unit # The sentiment analysis result for `query_text`.
1266 # of analysis, such as the query text.
1267 &quot;magnitude&quot;: 3.14, # A non-negative number in the [0, +inf) range, which represents the absolute
1268 # magnitude of sentiment, regardless of score (positive or negative).
1269 &quot;score&quot;: 3.14, # Sentiment score between -1.0 (negative sentiment) and 1.0 (positive
1270 # sentiment).
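            # For example, an enthusiastic query such as &quot;This is great, thanks!&quot;
            # might yield a score near 0.8 with a similar magnitude, while a neutral
            # query tends toward a score of 0.0 (illustrative values only).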
1271 },
1272 },
1273 &quot;intentDetectionConfidence&quot;: 3.14, # The intent detection confidence. Values range from 0.0
1274 # (completely uncertain) to 1.0 (completely certain).
1275 # This value is for informational purpose only and is only used to
1276 # help match the best intent within the classification threshold.
1277 # This value may change for the same end-user expression at any time due to a
1278 # model retraining or change in implementation.
1279         # If there are multiple `knowledge_answers` messages, this value is set to
1280 # the greatest `knowledgeAnswers.match_confidence` value in the list.
1281 &quot;parameters&quot;: { # The collection of extracted parameters.
1282 #
1283 # Depending on your protocol or client library language, this is a
1284 # map, associative array, symbol table, dictionary, or JSON object
1285 # composed of a collection of (MapKey, MapValue) pairs:
1286 #
1287 # - MapKey type: string
1288 # - MapKey value: parameter name
1289 # - MapValue type:
1290 # - If parameter&#x27;s entity type is a composite entity: map
1291 # - Else: string or number, depending on parameter value type
1292 # - MapValue value:
1293 # - If parameter&#x27;s entity type is a composite entity:
1294 # map from composite entity property names to property values
1295 # - Else: parameter value
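        # For example, a simple string parameter might appear in this map as
        # { &quot;destination&quot;: &quot;Paris&quot; }, while a composite entity parameter maps
        # property names to property values (a hypothetical illustration, not API output).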
1296 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1297 },
1298 &quot;queryText&quot;: &quot;A String&quot;, # The original conversational query text:
1299 #
1300 # - If natural language text was provided as input, `query_text` contains
1301 # a copy of the input.
1302 # - If natural language speech audio was provided as input, `query_text`
1303 # contains the speech recognition result. If speech recognizer produced
1304 # multiple alternatives, a particular one is picked.
1305 # - If automatic spell correction is enabled, `query_text` will contain the
1306 # corrected user input.
1307 &quot;diagnosticInfo&quot;: { # Free-form diagnostic information for the associated detect intent request.
1308 # The fields of this data can change without notice, so you should not write
1309 # code that depends on its structure.
1310 # The data may contain:
1311 #
1312 # - webhook call latency
1313 # - webhook errors
1314 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1315 },
1316   },
1317   &quot;webhookStatus&quot;: { # The `Status` type defines a logical error model that is suitable for # Specifies the status of the webhook request.
1318       # different programming environments, including REST APIs and RPC APIs. It is
1319 # used by [gRPC](https://github.com/grpc). Each `Status` message contains
1320 # three pieces of data: error code, error message, and error details.
1321 #
1322 # You can find out more about this error model and how to work with it in the
1323 # [API Design Guide](https://cloud.google.com/apis/design/errors).
1324     &quot;code&quot;: 42, # The status code, which should be an enum value of google.rpc.Code.
1325 &quot;message&quot;: &quot;A String&quot;, # A developer-facing error message, which should be in English. Any
1326 # user-facing error message should be localized and sent in the
1327 # google.rpc.Status.details field, or localized by the client.
1328     &quot;details&quot;: [ # A list of messages that carry the error details. There is a common set of
1329         # message types for APIs to use.
1330 {
1331         &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
1332       },
1333 ],
1334 },
1335   }</pre>
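<p>The following is a minimal usage sketch, not part of the generated reference. It assumes application-default credentials are configured, uses placeholder identifiers such as <code>PROJECT_ID</code> and <code>SESSION_ID</code>, and sends an illustrative text query; it then reads a few of the response fields documented above.</p>
<pre>
from googleapiclient.discovery import build

# Build the Dialogflow v2 service client (uses application-default credentials).
service = build(&#x27;dialogflow&#x27;, &#x27;v2&#x27;)

# Placeholder session name; replace PROJECT_ID and SESSION_ID with real values.
session = &#x27;projects/PROJECT_ID/agent/sessions/SESSION_ID&#x27;

# An illustrative text query; languageCode must be a language supported by the agent.
body = {
    &#x27;queryInput&#x27;: {
        &#x27;text&#x27;: {
            &#x27;text&#x27;: &#x27;I want to book a room for Friday&#x27;,
            &#x27;languageCode&#x27;: &#x27;en&#x27;,
        },
    },
}

response = service.projects().agent().sessions().detectIntent(
    session=session, body=body).execute()

# Read a few of the fields documented above.
query_result = response.get(&#x27;queryResult&#x27;, {})
print(query_result.get(&#x27;queryText&#x27;))                  # the recognized or typed query
print(query_result.get(&#x27;fulfillmentText&#x27;))            # legacy single text response
print(query_result.get(&#x27;intentDetectionConfidence&#x27;))  # 0.0 (uncertain) to 1.0 (certain)
for context in query_result.get(&#x27;outputContexts&#x27;, []):
    print(context[&#x27;name&#x27;], context.get(&#x27;lifespanCount&#x27;, 0))

# If a webhook was involved, its status is reported alongside the query result.
if &#x27;webhookStatus&#x27; in response:
    status = response[&#x27;webhookStatus&#x27;]
    print(status.get(&#x27;code&#x27;), status.get(&#x27;message&#x27;))
</pre>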
1336</div>
1337
1338</body></html>