I think you need to have the Azure AI package installed in the active environment to be able to import its wrapper.
Do you know which package I have to install in the conda environment? I tried azure-code, but I still get the same error.
`azure-ai-inference` is the one you'll need if you want to use an Azure AI Foundry model. You may have to pip install it into your conda environment.
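In case it's useful, a minimal sketch of that step (the imports are the same ones from the traceback below; only the install command is added):

```python
# Install the Azure AI Foundry client into the active conda environment, e.g.:
#   python -m pip install azure-ai-inference
# Running pip via `python -m` helps ensure it lands in the environment the notebook uses.

# Once the package is present, the toponymy wrappers should import cleanly:
from toponymy.llm_wrappers import AzureAI
from toponymy.embedding_wrappers import AzureAIEmbedder
```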
I tried to use it, but it's a very complicated service and it has some limits for EU cards. Do you think it's feasible to use GPT instead?
There are LLM and embedding wrappers for most providers here. However, if you are planning to use the OpenAI wrappers, make sure to use the latest version of the package from the repo. I made a PR about the OpenAI wrappers a few days ago and it has been merged, but there hasn't been a release since then, so the version you would get from `pip install toponymy` wouldn't work.
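For example, something like this (a minimal sketch; the repo URL is my assumption for where toponymy lives, so adjust if needed):

```python
# Install the current development version of toponymy straight from GitHub,
# so the merged OpenAI-wrapper fix is included (repo URL assumed here):
#   pip install "git+https://github.com/TutteInstitute/toponymy.git"

# Then confirm which version is actually active in the environment:
import importlib.metadata
print(importlib.metadata.version("toponymy"))
```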
Note that, as with Azure AI Foundry, you will need to install the relevant provider package to enable it within toponymy. So if you want to use OpenAI then you'll need to install `openai` into your environment for toponymy to see it, and so on. Anthropic, Cohere, and OpenAI are all available, as well as local LLMs (assuming you have a GPU) via `llama_cpp`, and, in the most recent version on GitHub, vLLM.
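A rough sketch of the OpenAI route (the `OpenAI`/`OpenAIEmbedder` wrapper names and constructor arguments below are assumptions patterned on the Azure naming, so check the repo for the exact API):

```python
import os

# Requires `pip install openai` in the same environment as toponymy.
# Class names and arguments here are illustrative, not the confirmed API.
from toponymy.llm_wrappers import OpenAI
from toponymy.embedding_wrappers import OpenAIEmbedder

llm = OpenAI(model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])
embedder = OpenAIEmbedder(
    model="text-embedding-3-small", api_key=os.environ["OPENAI_API_KEY"]
)
```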
Note also that, at the time of writing, the async/batch versions of the service wrappers weren't in the released package either, but you may want to consider using those as they will be faster. Just prefix the wrapper name with `Async` to get that to work, so for example `AsyncOpenAI`, etc.
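Switching to the batch version would then look something like this (again a sketch, with the same illustrative arguments):

```python
import os

# Same idea, but using the async/batch wrapper for faster topic labelling.
# Only the class name changes; the arguments remain illustrative.
from toponymy.llm_wrappers import AsyncOpenAI

llm = AsyncOpenAI(model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])
```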
I am playing with the code, but I have a problem I do not understand here:
ImportError                               Traceback (most recent call last)
Cell In[48], line 2
      1 from toponymy import Toponymy, ToponymyClusterer, KeyphraseBuilder, ClusterLayerText
----> 2 from toponymy.llm_wrappers import AzureAI
      3 from toponymy.embedding_wrappers import AzureAIEmbedder

ImportError: cannot import name 'AzureAI' from 'toponymy.llm_wrappers' (/opt/homebrew/Caskroom/miniconda/base/envs/OAPEN/lib/python3.12/site-packages/toponymy/llm_wrappers.py)
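A quick way to check whether the missing package is actually visible to the interpreter the notebook is running (a small diagnostic sketch):

```python
import sys

# Which Python environment is the notebook actually using?
print(sys.executable)

# Is the Azure AI Foundry client importable from that environment?
try:
    import azure.ai.inference  # provided by the azure-ai-inference package
    print("azure-ai-inference is installed")
except ImportError:
    print("azure-ai-inference is NOT installed in this environment")
```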