LLMs such as NuExtract look promising, but ollama currently does not work well with it.
[Hugging Face](https://huggingface.co/numind/NuExtract) | [try it here](https://huggingface.co/spaces/numind/NuExtract) | [ollama model](https://ollama.com/library/nuextract)
## few-shot prompt
Few-shot prompting is best realized via `api/chat` rather than `api/generate` in ollama, according to [this post](https://github.com/ollama/ollama/issues/2217): the examples are supplied as prior chat turns and submitted in `messages` instead of `prompt`. `ask_ollama` can call either API endpoint, but few-shot prompting has not been tested yet.
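
As a rough illustration (not the actual `ask_ollama` implementation), a few-shot call against `api/chat` might look like the Python sketch below. The local endpoint is ollama's default; the model name and the example records are placeholders.

```python
# Minimal sketch: few-shot prompting via ollama's /api/chat endpoint.
# Assumptions: ollama is running locally on the default port, and the model
# name ("nuextract") is only a placeholder for whatever model you have pulled.
import json
import requests

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default ollama endpoint


def chat_few_shot(model: str, examples: list[tuple[str, str]], query: str) -> str:
    """Send few-shot examples as prior chat turns, then ask the real question."""
    messages = [{"role": "system", "content": "Extract the requested fields as JSON."}]
    for user_text, assistant_text in examples:
        # each example becomes a user/assistant exchange in the chat history
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": query})

    resp = requests.post(
        OLLAMA_CHAT_URL,
        json={"model": model, "messages": messages, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    # hypothetical extraction example; replace with real input/output pairs
    examples = [
        ("Patient aged 54 with hypertension.",
         json.dumps({"age": 54, "condition": "hypertension"})),
    ]
    print(chat_few_shot("nuextract", examples, "Patient aged 67 with diabetes."))
```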