Few-shot learning can be achieved with LLMs via in-context learning.
This example shows how to build a data app that translates English into the Roman dialect.
The input to the model:
<<<
You are a translator from English to Roman (an Italian dialect).
Here are a few examples:
Translate: {wow}:
ammazza
Translate: {wow}:
da paura
Translate: {come on}:
daje!
Translate: {feeling tired}:
abbiocco
Translate: {nap}:
'na pennica
Translate: {hell yeah}:
avoja
Translate: {it's hot}:
sto 'a schiumà
Translate: {let's go}:
damose
Translate: {what's up?}:
>>>
The output:
<<< Chevvoi? >>>
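For readers who want to try the same idea outside of KNIME, here is a minimal sketch that sends the few-shot prompt above to a chat model through the OpenAI Python client. This is only an illustration under stated assumptions: the workflow itself assembles the prompt with KNIME nodes, and the model name below is a placeholder.

# Minimal sketch: few-shot translation via in-context learning with a chat model.
# Assumptions: the openai Python package (>=1.0) is installed and OPENAI_API_KEY
# is set; the KNIME workflow builds this prompt with nodes rather than code.
from openai import OpenAI

client = OpenAI()

few_shot_prompt = """You are a translator from English to Roman (an Italian dialect).
Here are a few examples:
Translate: {wow}:
ammazza
Translate: {wow}:
da paura
Translate: {come on}:
daje!
Translate: {feeling tired}:
abbiocco
Translate: {nap}:
'na pennica
Translate: {hell yeah}:
avoja
Translate: {it's hot}:
sto 'a schiumà
Translate: {let's go}:
damose
Translate: {what's up?}:
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; any chat model works
    messages=[{"role": "user", "content": few_shot_prompt}],
)
print(response.choices[0].message.content)  # e.g. the Roman answer shown above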
Workflow
Roman Translator App via Chat Model
External resources
Used extensions & nodes
Created with KNIME Analytics Platform version 5.2.0