Prerequisites
Before diving in, ensure you have the following:
- A macOS system (this guide uses Homebrew on macOS, but Ollama also supports Linux and Windows).
- Homebrew installed for package management.
- Basic familiarity with Angular or JavaScript for frontend development.
- You can download Ollama for all operating systems from https://ollama.com/download.
Step 1: Installing Ollama
Install Ollama using Homebrew by running the following command:
brew install ollama
Once installed, start the Ollama service:
ollama serve
This launches the Ollama server locally, typically accessible at http://localhost:11434.
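To confirm the server is up before going further, you can hit the API from a quick script. Here is a minimal TypeScript sketch (the checkOllama helper name is just for illustration; it assumes the default port 11434 and a runtime with the fetch API, such as Node 18+ or a browser) that lists locally installed models via the /api/tags endpoint:
// Minimal sketch: verify the Ollama server is reachable by listing installed models.
// Assumes the default Ollama port 11434 and a runtime that provides fetch.
async function checkOllama(): Promise<void> {
  const res = await fetch('http://localhost:11434/api/tags');
  if (!res.ok) {
    throw new Error(`Ollama responded with status ${res.status}`);
  }
  const data = await res.json();
  // data.models lists the models currently pulled onto this machine.
  console.log(data.models.map((m: { name: string }) => m.name));
}

checkOllama().catch(console.error);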
Step 2: Installing the Model
To use the DeepSeek model, you first need to pull it into your local setup. Run the command:
ollama pull deepseek-r1:8b
This downloads the model locally so it’s ready for use. If performance is a concern, consider pulling a smaller distilled version instead (e.g., deepseek-r1:1.5b or deepseek-r1:7b).
Step 3: Testing the Model Locally
To ensure everything is working, test the model by initiating a chat session:
ollama run deepseek-r1:8b
Type your prompts and see the responses in your terminal. Exit the session (type /bye) when you’re ready to move forward.
Step 4: Interacting with the Model via API
Ollama provides an API endpoint that allows programmatic interaction with installed models. Here’s how you can do it in Angular.
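Before wiring this into Angular, it can help to sanity-check the endpoint directly. The following is a minimal sketch (the askOnce helper name is just for illustration) that posts a single non-streaming request to /api/chat with fetch and prints the reply:
// Minimal sketch: send one non-streaming chat request and print the assistant's reply.
// Assumes the default Ollama endpoint and the deepseek-r1:8b model pulled earlier.
async function askOnce(prompt: string): Promise<void> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'deepseek-r1:8b',
      messages: [{ role: 'user', content: prompt }],
      stream: false,
    }),
  });
  const data = await res.json();
  // For non-streaming requests, the reply text is in data.message.content.
  console.log(data.message.content);
}

askOnce('Say hello in one sentence.').catch(console.error);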
Setting Up Angular Project
- Install Angular CLI (if not already installed):
npm install -g @angular/cli
- Create a new Angular project:
ng new chat-app
- Add HttpClientModule: update your AppModule to include HttpClientModule (plus FormsModule, which the chat component’s ngModel binding requires):
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { FormsModule } from '@angular/forms';
import { HttpClientModule } from '@angular/common/http';

import { AppComponent } from './app.component';

@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule, FormsModule, HttpClientModule],
  providers: [],
  bootstrap: [AppComponent],
})
export class AppModule {}
Angular Service for Chat
Create a service to handle API communication:
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';

@Injectable({
  providedIn: 'root',
})
export class ChatService {
  // Ollama's local chat endpoint (default port 11434).
  url = 'http://localhost:11434/api/chat';

  constructor(private http: HttpClient) {}

  getChat(chatInput: string) {
    const payload = {
      model: 'deepseek-r1:8b',
      messages: [
        { role: 'user', content: chatInput },
      ],
      stream: false, // return the full reply in a single response
    };
    return this.http.post(this.url, payload);
  }
}
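If you prefer a typed response over any, you can describe the part of the non-streaming /api/chat reply that the app actually reads. The interface names below (OllamaChatMessage, OllamaChatResponse) are my own sketch, limited to the fields used in this guide:
// Minimal sketch of the non-streaming /api/chat response shape,
// covering only the fields this app reads.
export interface OllamaChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

export interface OllamaChatResponse {
  model: string;
  message: OllamaChatMessage;
  done: boolean;
}

// Hypothetical typed variant of the service call:
// return this.http.post<OllamaChatResponse>(this.url, payload);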
Angular Component for Chat
Use the service in a component to interact with the API:
import { Component } from '@angular/core';
import { ChatService } from './chat.service';

@Component({
  selector: 'app-root',
  template: `
    <div>
      <input [(ngModel)]="chatInput" placeholder="Type your message" />
      <button (click)="sendMessage()">Send</button>
    </div>
    <div *ngIf="chatResponse">
      <p>{{ chatResponse }}</p>
    </div>
  `,
})
export class AppComponent {
  chatInput = '';
  chatResponse = '';

  constructor(private chatService: ChatService) {}

  sendMessage() {
    this.chatService.getChat(this.chatInput).subscribe({
      next: (response: any) => {
        this.chatResponse = response.message.content; // Adjust as per API response structure
      },
      error: (error) => {
        console.error('Error:', error);
      },
    });
  }
}
Step 5: Optimizing Response Time
If you notice that responses take too long, here are some optimizations:
1. Use a Smaller Model
Switch to a smaller distilled version of the model, such as deepseek-r1:1.5b:
const payload = {
  model: 'deepseek-r1:1.5b', // Smaller distilled model for faster responses
  messages: [
    { role: 'user', content: this.chatInput },
  ],
  stream: false,
};
2. Add a System Message
Guide the model to produce concise responses:
const payload = {
  model: 'deepseek-r1:8b',
  messages: [
    { role: 'system', content: 'Respond concisely in plain text without Markdown or unnecessary tags.' },
    { role: 'user', content: this.chatInput },
  ],
  stream: false,
};
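Note that DeepSeek-R1 models typically emit their reasoning inside <think>…</think> tags in the reply. If those tags still show up despite the system prompt, you can strip them on the client; the stripThinking helper below is a small illustrative sketch:
// Sketch: remove DeepSeek-R1's <think>...</think> reasoning block from a reply
// before showing it to the user.
function stripThinking(reply: string): string {
  return reply.replace(/<think>[\s\S]*?<\/think>/g, '').trim();
}

// Usage in the component's subscribe callback:
// this.chatResponse = stripThinking(response.message.content);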
3. Enable Streaming
Streaming allows partial responses to be sent as they’re generated:
const payload = {
  model: 'deepseek-r1:8b',
  messages: [
    { role: 'user', content: this.chatInput },
  ],
  stream: true,
};
Handle the streamed response in chunks for faster perceived responses.
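Angular’s HttpClient doesn’t expose the response body incrementally out of the box, so one option is to read the stream with the browser’s fetch API instead. The sketch below assumes Ollama streams newline-delimited JSON objects, each carrying a piece of the reply in message.content, with done: true on the final object; the streamChat helper name is just for illustration:
// Minimal sketch: consume Ollama's streaming /api/chat output with fetch.
async function streamChat(prompt: string, onToken: (text: string) => void): Promise<void> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'deepseek-r1:8b',
      messages: [{ role: 'user', content: prompt }],
      stream: true,
    }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON object; keep any trailing partial line in the buffer.
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? '';
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) {
        onToken(chunk.message.content);
      }
    }
  }
}

// Example: append tokens to the component's chatResponse as they arrive.
// streamChat(this.chatInput, (text) => (this.chatResponse += text));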
4. Optimize Server Performance
Ensure the server has sufficient resources (CPU, memory, or GPU). If necessary, scale up your system resources to handle the model efficiently.
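Beyond hardware, request-level settings can also help. For example, Ollama’s API accepts a keep_alive field that controls how long a model stays loaded in memory after a request, which avoids paying the model-load cost on every message. A sketch of adding it to the existing payload:
// Sketch: keep the model resident in memory for 30 minutes after each request,
// so follow-up messages don't pay the model-loading cost again.
const payload = {
  model: 'deepseek-r1:8b',
  messages: [
    { role: 'user', content: this.chatInput },
  ],
  stream: false,
  keep_alive: '30m', // how long Ollama keeps the model loaded after the request
};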
Conclusion
In this guide, we explored how to set up and use Ollama's DeepSeek model with Angular. From installation to API integration and optimization, you now have a complete roadmap for building a functional chat application. By applying these techniques, you can ensure your application is both responsive and efficient.
Feel free to experiment with system prompts and smaller models to tailor the chat experience to your needs. Happy coding!