
NLG:

Natural Language
Generation

Specialization Diploma in Application Development with Artificial Intelligence

Contributors:
Erasmo Gomez
Franco Pariasca
Rodrigo Lopez

The art of generating text from structured data using AI
What is NLG?

“Process of producing meaningful phrases and sentences in the form of natural language.”

“Whereas NLP is focused on deriving analytic insights from textual data, NLG is used to synthesize textual content by combining analytic output with contextualized narratives.”

Artificial Intelligence: Natural Language Processing Fundamentals
How does NLG work?

◎ Formerly built with string templates.
◎ The predetermined order that templates are arranged in is known as a ‘storyplot’.
◎ Big data is used to spot newsworthy developments, such as a sharp and unexpected increase in one value.
◎ Projects analyse the data, make sense of it, and draw conclusions.
◎ The system is also taught to combine different pieces of information (‘messages’), depending on the underlying data.
◎ Variation is made possible by adding alternative words, phrases, adverbs, and synonyms that the system can choose from.
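The template approach above can be sketched in a few lines. This is a minimal illustration, not a production system: the storyplot, slot names, and synonym lists are all invented for the example.

```python
import random

# Templates in a fixed order -- the 'storyplot'. Variation comes from
# sampling alternative words the system can choose from.
STORYPLOT = [
    "{company} reported a {adjective} {direction} in revenue.",
    "Sales {verb} {percent}% compared to last quarter.",
]

SYNONYMS = {
    "adjective": ["sharp", "notable", "surprising"],
    "verb": ["rose", "climbed", "jumped"],
}

def realise(data):
    """Fill each template in storyplot order, picking random synonyms."""
    sentences = []
    for template in STORYPLOT:
        slots = dict(data)  # copy the structured data into the slot map
        for key, options in SYNONYMS.items():
            slots[key] = random.choice(options)
        sentences.append(template.format(**slots))
    return " ".join(sentences)

print(realise({"company": "Acme", "direction": "increase", "percent": 12}))
```

Running this several times produces slightly different texts from the same structured input, which is exactly the variation mechanism described above.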
How is NLG used?
Stages of NLG

◎ Document planning: deciding what is to be said and creating an abstract document that outlines the structure of the information to be presented.

◎ Microplanning: generation of referring expressions, word choice, and aggregation to flesh out the document specifications.

◎ Realisation: converting the abstract document specifications to a real text, using domain knowledge about syntax, morphology, etc.
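The three stages can be sketched as a tiny pipeline. All function names and the weather data are illustrative; a real system would use far richer plans and grammar rules.

```python
# Toy sketch of the classic three-stage NLG pipeline.

def document_plan(data):
    # Document planning: decide what to say and in what order.
    return [("temperature", data["temp"]), ("condition", data["condition"])]

def microplan(plan):
    # Microplanning: word choice and aggregation into sentence specs.
    specs = []
    for topic, value in plan:
        if topic == "temperature":
            specs.append(f"the temperature is {value} degrees")
        elif topic == "condition":
            specs.append(f"skies are {value}")
    return specs

def realise_text(specs):
    # Realisation: apply simple syntax rules to produce the final text.
    return "Today " + " and ".join(specs) + "."

text = realise_text(microplan(document_plan({"temp": 21, "condition": "clear"})))
print(text)  # Today the temperature is 21 degrees and skies are clear.
```

Each stage only consumes the previous stage's output, which is what makes the abstract document plan reusable across different surface realisations.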
Two requirements for NLG

◎ Content has to make sense in context: what do we want to say?
◎ Language has to be used correctly: how do we say it?
Where does ML get involved?

◎ RNNs: character-, word-, and phrase-based LSTMs.

◎ Transformers: a newer mechanism built on self-attention; used for text generation.

◎ Language models: Markov chains; GPT-N.
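A Markov chain is the simplest of these language models: each word is chosen only from the words observed to follow the current one. Here is a minimal word-level sketch with a made-up training corpus.

```python
import random
from collections import defaultdict

def train(corpus):
    """Map each word to the list of words seen following it."""
    chain = defaultdict(list)
    words = corpus.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=8, seed=0):
    """Walk the chain from `start`, sampling one follower at a time."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        followers = chain.get(out[-1])
        if not followers:  # dead end: no observed continuation
            break
        out.append(random.choice(followers))
    return " ".join(out)

chain = train("the cat sat on the mat and the dog sat on the rug")
print(generate(chain, "the"))
```

GPT-style models replace the single-word lookup table with a neural network conditioned on the whole preceding context, but the generation loop is conceptually the same.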


GPT-N
Generative Pre-trained
Transformers
Language Models
Autoregressive Models

An autoregressive model can be seen as a model that utilizes its previous predictions for generating new ones. In doing so, it can continue infinitely, or, in the case of NLP models, until a stop signal is predicted.
Autoregressive Models

◎ The input is first embedded. This embedding is a matrix (a position embedding matrix), and hence the actual input is a vector with multiple tokens.
◎ Twelve decoder segments, with masked multi-head attention segments, feedforward segments, and layer normalization segments, interpret the input values.
◎ The output can be a text prediction; in that case, the task is to model language. However, it can also be used for other tasks, such as similarity detection and multiple-choice answering.
How to choose a model?

If the idea is that the model as a whole transducts (i.e. transforms without
altering semantics) one sequence into another, then we’re talking about a
Seq2Seq model.

If the idea is that you learn an encoded representation of the inputs by corrupting inputs and generating the original variants, we’re talking about an autoencoding model.

If the idea is that you use all previous predictions for generating the next
one, in a cyclical fashion, we’re talking about an autoregressive model.
Evaluating language models

◎ Extrinsic evaluation: This involves evaluating the models by employing them in an actual task (such as machine translation) and looking at their final loss/accuracy.
◎ Intrinsic evaluation: This involves finding some metric to evaluate the language model itself, not taking into account the specific tasks it’s going to be used for. Perplexity is an intrinsic evaluation method.
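Perplexity can be computed as the exponential of the average negative log-likelihood the model assigns to the true test tokens. A quick sketch, using hand-picked probabilities rather than a real model:

```python
import math

def perplexity(token_probs):
    """token_probs: probabilities the model assigned to each true token."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model certain of every token (p = 1) has perplexity 1, the minimum.
print(perplexity([1.0, 1.0, 1.0]))     # 1.0

# Uniform guessing among 4 options (p = 0.25) gives perplexity ~4:
# the model is as "perplexed" as a 4-way coin flip.
print(perplexity([0.25, 0.25, 0.25]))  # ~4.0
```

Lower perplexity means the model finds the test text less surprising, which is why it works as a task-independent, intrinsic metric.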
GPT Original Model
Transformer Masked Self Attention Architecture

◎ Increasing model size and training on larger data can lead to improved performance.
◎ A single model can provide good performance on a host of NLP tasks.
◎ The model can infer from new data without the need for fine-tuning.
◎ The model can solve problems on datasets it has never been trained on.
The future ...
Break: 15 min
● Music recommendation: Caravan Palace
