I have a Qwen (Qianwen) large language model deployed on an internal Linux server. After adding it under "Connect Custom APIs," the client only ever calls the DeepSeek model served locally on my Windows 10 machine via Ollama; it never reaches the Qwen model on the Linux server. What should I do?
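As a first diagnostic, it helps to confirm that the Linux server's API is even reachable from the Windows machine. The sketch below assumes the server exposes an Ollama-compatible HTTP API (Ollama's default port is 11434, and `GET /api/tags` lists installed models); the address `192.168.1.50` is a hypothetical placeholder for your internal server's IP:

```python
import json
import urllib.error
import urllib.request

def check_endpoint(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if an Ollama-style server answers at base_url.

    Ollama serves GET /api/tags, which lists the models it has
    installed; a successful response means the server is reachable
    from this machine (i.e. no firewall or binding problem).
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            models = json.load(resp).get("models", [])
            print(f"reachable, {len(models)} model(s) installed")
            return True
    except (urllib.error.URLError, OSError) as exc:
        print(f"not reachable: {exc}")
        return False

# Hypothetical internal address -- substitute your Linux server's IP and port:
# check_endpoint("http://192.168.1.50:11434")
```

If this returns False, the problem is network-level (server bound only to 127.0.0.1, a firewall rule, or a wrong host/port in the custom-API config) rather than anything in the client itself; a common fix for Ollama specifically is setting `OLLAMA_HOST=0.0.0.0` on the server so it listens on all interfaces.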


