In the rapidly evolving landscape of AI development, local LLM solutions like Ollama have become invaluable tools for developers seeking to harness the power of large language models without relying on cloud-based APIs. However, many developers encounter a common roadblock when integrating Ollama with web applications: Cross-Origin Resource Sharing (CORS) issues.
Understanding the Problem
When I recently attempted to integrate a locally running Ollama instance with a web application, I faced the dreaded CORS error:
Access to fetch at 'http://localhost:11434/api/chat'
from origin 'http://localhost:5173' has been blocked
by CORS policy: Request header field 'authorization' is not allowed by
Access-Control-Allow-Headers in preflight response.
This error occurs because Ollama’s default configuration restricts which origins can access its API and which headers are allowed in requests. While this is a sensible security measure, it creates friction when you’re building applications that need to communicate with Ollama from a different origin – even if that origin is just another port on localhost.
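For illustration, a browser-side request shaped roughly like the one below is enough to make the browser issue a preflight OPTIONS request, which is where the error above originates. This is only a sketch using fetch; the model name and the Authorization header are stand-ins for whatever your client library actually sends:

// Run from a page on http://localhost:5173 (inside an async function).
// The Authorization and Content-Type headers make this a "non-simple" request,
// so the browser sends an OPTIONS preflight to Ollama before the POST.
const response = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer anything', // placeholder; many SDKs attach this automatically
  },
  body: JSON.stringify({
    model: 'llama3', // placeholder model name
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});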
The Root Cause
By default, Ollama binds to 127.0.0.1 (localhost) on port 11434 and only accepts requests from certain origins. When a web application running on a different port or domain tries to access the Ollama API, the browser’s same-origin policy blocks the request unless proper CORS headers are present in the response.
The issues are typically:
- A missing Access-Control-Allow-Origin header to permit the requesting domain
- A missing Access-Control-Allow-Headers configuration for headers like authorization, user-agent, or custom headers your application might be sending
Solutions for Different Operating Systems
Let’s explore how to resolve these CORS issues across different operating systems.
macOS
On macOS, Ollama runs as an application, and you need to set environment variables using launchctl:
# Allow all origins
launchctl setenv OLLAMA_ORIGINS "*"
# Or specify particular origins
launchctl setenv OLLAMA_ORIGINS "http://localhost:5173,https://yourdomain.com"
After setting this variable, restart the Ollama application for changes to take effect.
Windows
On Windows, you’ll need to set environment variables through the system:
- Quit the Ollama application from the taskbar
- Open Settings or Control Panel and search for “environment variables”
- Click “Edit environment variables for your account”
- Add a new variable:
  - Variable name: OLLAMA_ORIGINS
  - Variable value: * (for all origins) or your specific comma-separated origins
- Click OK/Apply to save
- Restart Ollama from the Start menu
Linux
If you’re running Ollama as a systemd service on Linux:
# Edit the service configuration
sudo systemctl edit ollama.service
# Add the following lines
[Service]
Environment="OLLAMA_ORIGINS=*"
# Save and exit, then reload and restart
sudo systemctl daemon-reload
sudo systemctl restart ollama
If you’ve installed Ollama through other means, ensure the environment variable is set before starting the service.
Testing Your Configuration
To verify that your CORS settings are working correctly, you can use curl to simulate a preflight request:
curl -X OPTIONS http://localhost:11434 -H "Origin: http://localhost:5173" -H "Access-Control-Request-Method: GET" -I
With properly configured CORS, you should see a response like:
HTTP/1.1 204 No Content
Access-Control-Allow-Headers: Authorization,Content-Type,User-Agent,Accept,X-Requested-With,X-Stainless-Lang,X-Stainless-Package-Version,X-Stainless-Os,X-Stainless-Arch,X-Stainless-Runtime,X-Stainless-Runtime-Version,X-Stainless-Async
Access-Control-Allow-Methods: GET,POST,PUT,PATCH,DELETE,HEAD,OPTIONS
Access-Control-Allow-Origin: *
Access-Control-Max-Age: 43200
Date: Wed, 09 Apr 2025 10:13:26 GMT
If you’re still seeing a 403 Forbidden response, double-check your configuration and ensure Ollama has been restarted.
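If you prefer JavaScript over curl, the same preflight can be simulated with fetch (Node 18+ ships a global fetch); this is just a convenience sketch of the check above:

// Simulate the browser's preflight request against Ollama (Node 18+)
const res = await fetch('http://localhost:11434', {
  method: 'OPTIONS',
  headers: {
    'Origin': 'http://localhost:5173',
    'Access-Control-Request-Method': 'GET',
  },
});
console.log(res.status); // expect 204 when CORS is configured
console.log(res.headers.get('access-control-allow-origin')); // expect "*" or your origin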
Working with Frontend Frameworks
When using modern frontend frameworks with Ollama, you might encounter additional challenges. Here are some framework-specific considerations:
React Applications
For React applications using the OpenAI SDK to communicate with Ollama, you may need to modify how the SDK handles headers. The default OpenAI SDK adds several headers that might be rejected by Ollama’s CORS configuration:
import OpenAI from 'openai';

// Create a custom OpenAI client class that removes problematic headers
class OllamaCompatibleOpenAI extends OpenAI {
  defaultHeaders(opts) {
    return {
      'Accept': 'application/json',
      'Content-Type': 'application/json',
      // We omit auth headers and x-stainless headers that can cause CORS issues
    };
  }
}

// Initialize with your configuration
const ollamaClient = new OllamaCompatibleOpenAI({
  baseURL: 'http://localhost:11434/v1',
  apiKey: 'ollama', // the SDK requires a key, but Ollama ignores its value
  dangerouslyAllowBrowser: true,
});
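With that client in place, calls go through Ollama's OpenAI-compatible endpoint as usual. A rough usage sketch, assuming you have already pulled a model locally (llama3 below is just a placeholder name):

// Example call through the custom client (run inside an async function)
const completion = await ollamaClient.chat.completions.create({
  model: 'llama3', // placeholder; use any model pulled with `ollama pull`
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
});
console.log(completion.choices[0].message.content);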
Vue and Other Frameworks
For other frameworks, you might need to set up a proxy in your development server. For example, in a Vue project using Vite:
// vite.config.js
import { defineConfig } from 'vite';
import vue from '@vitejs/plugin-vue';

export default defineConfig({
  plugins: [vue()],
  server: {
    proxy: {
      '/api': {
        target: 'http://localhost:11434',
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, ''),
      },
    },
  },
});
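With this proxy, the browser only ever talks to the Vite dev server, so CORS never comes into play. One subtlety worth noting: because the rewrite strips the leading /api, Ollama's own /api/... endpoints are reached by doubling the prefix. A quick sketch (inside an async function in your app code):

// /api/api/chat on the dev server is rewritten to /api/chat on Ollama
const res = await fetch('/api/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3', // placeholder model name
    messages: [{ role: 'user', content: 'Hello from the dev server!' }],
    stream: false,
  }),
});
console.log(await res.json());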
Alternative Approaches
If modifying environment variables isn’t feasible in your setup, consider these alternatives:
Using a Proxy Server
You can set up a proxy server like Nginx to handle CORS headers:
server {
    listen 8080;
    server_name localhost;

    location /api {
        proxy_pass http://localhost:11434;
        proxy_set_header Host 'localhost:11434';

        # Add CORS headers
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'Origin, Content-Type, Accept, Authorization, X-Requested-With';

        # Handle preflight requests
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' '*';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
            add_header 'Access-Control-Allow-Headers' 'Origin, Content-Type, Accept, Authorization, X-Requested-With';
            add_header 'Content-Type' 'text/plain; charset=UTF-8';
            add_header 'Content-Length' 0;
            return 204;
        }
    }
}
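Your application then targets the proxy on port 8080 instead of Ollama directly. Since proxy_pass has no trailing path, the original URI is forwarded unchanged, so Ollama's endpoints keep their /api/... paths. A quick check with Node 18+ (the endpoint and expected header are per the config above):

// List local models through the Nginx proxy and confirm the CORS header is added
const res = await fetch('http://localhost:8080/api/tags');
console.log(res.headers.get('access-control-allow-origin')); // expect "*"
console.log(await res.json());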
Node.js Middleware
Another approach is to create a simple Node.js server as middleware:
// A minimal Express proxy in front of Ollama that adds CORS headers
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

app.use('/api', createProxyMiddleware({
  target: 'http://localhost:11434',
  changeOrigin: true,
  pathRewrite: {
    '^/api': '', // strip the /api prefix before forwarding
  },
  onProxyRes: function (proxyRes, req, res) {
    // Decorate every proxied response with permissive CORS headers
    proxyRes.headers['Access-Control-Allow-Origin'] = '*';
    proxyRes.headers['Access-Control-Allow-Headers'] = 'Origin, Content-Type, Accept, Authorization, X-Requested-With';
    proxyRes.headers['Access-Control-Allow-Methods'] = 'GET, POST, OPTIONS';
  },
}));

app.listen(3000, () => {
  console.log('Proxy server running on port 3000');
});
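The frontend then points at http://localhost:3000. Note that the Express mount path and the pathRewrite together strip a single /api prefix from the original URL, so Ollama's /api/... routes end up doubled on the proxy side, which is easy to trip over. A quick check with Node 18+:

// /api/api/tags on the proxy is forwarded to Ollama's /api/tags
const res = await fetch('http://localhost:3000/api/api/tags');
console.log(res.headers.get('access-control-allow-origin')); // "*" added by the middleware
console.log(await res.json());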
Browser-Specific Considerations
Some browsers handle CORS more strictly than others. Firefox, for instance, often enforces stricter CORS policies than Chrome. If your application works in Chrome but fails in Firefox, it’s likely due to header restrictions that need to be addressed in your CORS configuration.
In my testing, these differences became apparent when a user-agent header was automatically added by Firefox but rejected by Ollama's CORS configuration, while Chrome's requests worked fine. Ensuring your OLLAMA_ORIGINS setting is properly configured should resolve these browser-specific issues.
When Working with Docker
If you’re running Ollama in Docker, you can set the environment variables when starting the container:
docker run -d -p 11434:11434 -e OLLAMA_ORIGINS="*" ollama/ollama
For more complex setups, you can create a Docker Compose file:
version: '3'
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      - OLLAMA_ORIGINS=*
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
Conclusion
CORS issues with Ollama are a common stumbling block, but they're relatively straightforward to resolve once you understand the underlying mechanisms. By properly configuring the OLLAMA_ORIGINS environment variable for your operating system, you can seamlessly integrate locally running Ollama instances with your web applications.
As local AI development continues to evolve, understanding and addressing these integration challenges becomes increasingly important. With these solutions in your toolkit, you can focus on building innovative AI applications rather than battling browser security policies.
Remember that while it might be tempting to use the wildcard CORS setting (*) during development, consider tightening these configurations in production environments to enhance security.
Have you encountered other integration challenges with Ollama or local LLMs? Share your experiences and solutions in the comments below!