Few-shot learning can be achieved with LLMs via in-context learning.
This example shows how to build a data app that translates English into Roman dialect.
The input to the model:
<<<
Translate: {wow}:
ammazza
Translate: {wow}:
da paura
Translate: {come on}:
daje!
Translate: {feeling tired}:
abbiocco
Translate: {nap}:
'na pennica
Translate: {hell yeah}:
avoja
Translate: {it's hot}:
sto 'a schiumà
Translate: {let's go}:
damose
Translate: {what's up?}:
>>>
The output:
<<< Che succede? >>>
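The prompt above can be assembled programmatically: each in-context example is an input/output pair, and the new query is appended with an empty completion for the model to fill in. A minimal sketch in Python, assuming the example pairs above and a hypothetical `build_prompt` helper (the actual workflow builds this prompt inside KNIME nodes):

```python
# Few-shot in-context learning: the "training" happens entirely in the prompt.
# Example pairs taken from the prompt shown above.
FEW_SHOT_PAIRS = [
    ("wow", "ammazza"),
    ("wow", "da paura"),
    ("come on", "daje!"),
    ("feeling tired", "abbiocco"),
    ("nap", "'na pennica"),
    ("hell yeah", "avoja"),
    ("it's hot", "sto 'a schiumà"),
    ("let's go", "damose"),
]

def build_prompt(query: str) -> str:
    """Assemble the few-shot examples followed by the new query.

    The model sees the repeated Translate/answer pattern and is
    expected to continue it by translating the final query.
    """
    lines = []
    for english, roman in FEW_SHOT_PAIRS:
        lines.append(f"Translate: {{{english}}}:")
        lines.append(roman)
    # The last line is left unanswered; the LLM completes it.
    lines.append(f"Translate: {{{query}}}:")
    return "\n".join(lines)

prompt = build_prompt("what's up?")
print(prompt)
```

The resulting string would then be sent to the LLM as a single completion request; the same pattern works with any completion-style API.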
Workflow: Roman Translator App via LLM
Created with KNIME Analytics Platform version 5.2.0