
About
If you follow AI news even a little bit, you've probably heard of DeepSeek, the new AI app built in China which is supposed to be ...
Thank you for this, it was a fun, fairly easy thing to follow along with. I am, however, having one problem: I am getting a 403 Forbidden error when I try to access the web page. I followed the instructions exactly, with the exception of using port 3003 instead of 3001, since another container was already using 3001. I've installed this on a NAS and am trying to access the page from my laptop; I do not have issues accessing other containers. What might be causing this? Is there something I need to add or change in the docker-compose.yaml?
Hello Guiseppe! Thank you for reading! This might be because Nginx isn't finding the contents of your web page. Are you sure your docker-compose has the correct directory mounted as a volume in the nginx container?
I used the docker-compose exactly as you've shown it, with the only change being port 3003 instead of 3001. What wasn't clear to me was what the file structure should be, exactly. Currently, I have a folder called deepseek; inside that is the docker-compose.yaml, the ollama-models folder (which does have the blobs, manifests, etc. inside it after starting the container) and the web folder (with the html, js & css files inside). Is this correct, or does the web directory go elsewhere?
For reference, I am able to see "ollama is running" at port 11434.
Check this repository to see how the project structure should look:
github.com/SavvasStephanides/local...
I put this on the back burner for a while but finally got back into it today and figured it out, it was a permissions issue. I had to run sudo chown 101:101 -R '/path/to/files' to get it to finally load the page.
I am having another issue though, pushing the Ask! button does nothing. Looking at my browser console I am getting CORS errors about "No 'Access-Control-Allow-Origin' header is present on the requested resource."
The CORS error I was able to fix by setting the environment variable OLLAMA_ORIGINS to * in my docker-compose.yaml file. The 'Ask!' button not doing anything I was able to fix by running 'ollama run deepseek-r1:7b' in a terminal, letting that load, then refreshing my browser page.
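For anyone hitting the same CORS error, the variable can be set in the compose file like this (the service name and port here are assumptions based on the thread, not the tutorial's exact file):

```yaml
services:
  ollama:
    image: ollama/ollama
    environment:
      - OLLAMA_ORIGINS=*   # allow browser requests from any origin
    ports:
      - "11434:11434"
```

Note that * opens the API to any origin, which is fine for a local setup but worth tightening if the NAS is reachable from outside your network.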
Since you are encountering a 403 Forbidden error when trying to access the web page, here are a few possible causes and solutions:
Check the Web Server Configuration
If the container is running a web server (like Nginx or Apache), check the configuration files to ensure that access is allowed from external sources.
If you are using Nginx, check the nginx.conf or site configuration file to confirm that it is not restricting access based on IP.
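For illustration, a restriction like the following in the site config would return 403 Forbidden to anything outside the allowed range (a hypothetical example, not something from the tutorial):

```nginx
location / {
    root  /usr/share/nginx/html;
    allow 192.168.1.0/24;  # only the local network
    deny  all;             # everyone else gets 403 Forbidden
}
```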
Verify Port Binding in docker-compose.yaml
Ensure that the port is correctly mapped in docker-compose.yaml. If you are running the service on port 3003, make sure your YAML file includes a matching ports mapping.
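Assuming Nginx listens on its default port 80 inside the container, the mapping for host port 3003 would look like:

```yaml
services:
  web:
    image: nginx
    ports:
      - "3003:80"  # host port 3003 → container port 80
```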
Check NAS Firewall Settings
Some NAS devices have built-in firewall rules that restrict access to certain ports. Ensure that your NAS allows traffic on port 3003; you may need to create a rule to allow external access.
Verify File and Folder Permissions
A 403 error can also be caused by permission issues. Ensure that the web server has the necessary permissions to access the files and directories it is serving.
Examine Logs for More Details
Run docker logs <container-name> to check whether there are any error messages related to access permissions.
Thank you, it was a permissions issue, see my post above. Having a different issue now though...
Everything seemed to install OK, and clicking on the links in between steps all showed what was expected. The only issue is nothing happens when I type something in and click "Ask!".
Hello John! Your browser's console might give you a hint as to why this happens.
If not: I finally did get it to work. It seems to be an issue with ollama not downloading the model, per this page. In a terminal I ran 'ollama run deepseek-r1:7b'; it took a bit to load, but once it did, everything started working. I hope this helps anyone who runs into the same issue.
Were you ever able to fix this? I am having the same issue.
No. I ended up moving to OpenAI and ollama docker containers and it's working fine.
Thanks for the great tuts & resources. The steps provided are working fine.
And I'm curious about how to input docs in the prompt & ask.
I'm sure the Ollama docs will have what you need.
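In the meantime, one naive approach (a sketch of my own, not from the tutorial) is to paste the document's text into the prompt before sending it to the model:

```javascript
// Sketch: stuff a document's text into the prompt (naive approach;
// very long documents may exceed the model's context window).
function buildDocPrompt(docText, question) {
  return `Here is a document:\n\n${docText}\n\nBased on the document above, ${question}`;
}

console.log(buildDocPrompt("Docker is a container platform.", "what is Docker?"));
```

The combined string then goes wherever the page currently puts the plain question.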
I have followed the instructions but the "web" container is unable to access the "ollama" container.
They are both running on a remote server from my web browser.
Also, if I curl localhost:11434 from inside the "web" container, it is unable to connect, but if I curl ollama:11434, I get the "Ollama is running" response.
Yeap. That's how networking works in Docker. If you need to curl a container from within another container, you can use the name of the container you wish to reach:
ollama:11434
in this case.

If I send a POST directly from the "web" container with the correct data, it simply times out.
Tried moving the containers to an x86_64 server with the same outcome.
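For anyone debugging these requests, here is a minimal sketch of what the page's POST would look like. The model name comes from earlier in the thread and the payload shape assumes Ollama's /api/generate endpoint; the helper name is mine:

```javascript
// Build the fetch options for Ollama's /api/generate endpoint.
// From the browser, use the host-mapped address (e.g. http://localhost:11434);
// from another container on the same Compose network, use http://ollama:11434
// instead of localhost, since each container has its own loopback interface.
function buildGenerateRequest(model, prompt) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

const req = buildGenerateRequest("deepseek-r1:7b", "What is Docker?");
// Send it with: fetch("http://ollama:11434/api/generate", req)
console.log(JSON.parse(req.body).model);
```

If a request like this still times out even when curl to ollama:11434 succeeds, the problem is usually in the URL the page is actually using, so logging it before the fetch is a quick sanity check.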