skywarth
Are you using OpenAI API? Then you need to be prepared!

If your application depends on the OpenAI API, you should know that at some point it will go down, even if only for a short period. When that happens, you want to know immediately, act accordingly, or even trigger automations to mitigate the problem at hand. Anyone who monitors their external dependencies knows exactly what I'm talking about.

This article is for those who:

  • Use the OpenAI API in their apps, or depend on it directly or indirectly
  • Run a Prometheus server with the Blackbox exporter
  • Enjoy Grafana, metrics, and visualization

Monitoring the status of external APIs is crucial for maintaining the health and reliability of your applications. For those using OpenAI's API, there's a catch: the official status endpoint always returns HTTP 200, even when the service is down, because the outage information lives in the response body rather than in the status code. Naturally, this prevents you from probing the API directly with the Prometheus Blackbox exporter.

This is where the OpenAI API Status Prober comes in handy. It acts as a proxy, translating the status into meaningful HTTP codes that integrate seamlessly with your Prometheus setup.
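To illustrate the idea: if the upstream status JSON exposes an overall health indicator (as Statuspage-style status APIs typically do), the core translation can be sketched as a small mapping function. This is a hypothetical sketch of the concept, not the prober's actual code; the function name and the exact mapping are my own.

```javascript
// Hypothetical sketch: map a Statuspage-style "indicator" field to an
// HTTP status code that the Blackbox exporter can act on.
// (Illustrative only — not the prober's actual implementation.)
function indicatorToHttpStatus(indicator) {
  switch (indicator) {
    case "none":
      return 200; // all systems operational -> probe succeeds
    case "minor":
    case "major":
    case "critical":
      return 500; // degraded or down -> probe fails
    default:
      return 500; // unknown state: fail safe
  }
}

console.log(indicatorToHttpStatus("none"));     // 200
console.log(indicatorToHttpStatus("critical")); // 500
```

The key design point is that the proxy collapses a rich status body into the one signal the Blackbox exporter's `http_2xx` module understands: the response code.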


Key Features

  • Accurate Status Reporting: Converts OpenAI's status API responses into proper HTTP codes (200/500/3xx).
  • Easy Integration: Simplifies the process of integrating OpenAI API status monitoring into Prometheus.
  • Flexible Installation Options: Supports global, local, and direct usage methods.

Why Use OpenAI API Status Prober?

The primary motivation for using this tool is the limitation of the official OpenAI status API. By providing a proxy that returns appropriate HTTP status codes, the prober makes it possible to integrate OpenAI's status into Prometheus, enhancing your monitoring capabilities.

Usage

Installation

You can install and set up OpenAI API Status Prober using three methods:

1. Global Installation:

npm install -g pm2
npm install -g openai-api-status-prober
openai-api-status-prober start
pm2 startup
pm2 save

2. Local Installation:

git clone https://github.com/skywarth/openai-api-status-prober.git
cd openai-api-status-prober
npm ci
node src/server.js

3. Direct Usage of the Production Deployment:

You can use the deployment directly via https://openai-api-status-prober.onrender.com/open-ai-status-prober/simplified_status. However, self-hosting is recommended to avoid overloading this shared instance.

Integrating into Prometheus Blackbox exporter

scrape_configs:
  - job_name: 'blackbox'
    metrics_path: /probe
    params:
      module: [http_2xx]
    static_configs:
      - targets:
        - http://127.0.0.1:9091/open-ai-status-prober/simplified_status
    relabel_configs:
      - source_labels: [__address__]
        target_label: __param_target
      - source_labels: [__param_target]
        target_label: instance
      - target_label: __address__
        replacement: 127.0.0.1:9115
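The `http_2xx` module referenced above is the Blackbox exporter's standard HTTP prober, which treats any 2xx response as success. If you don't already have it defined, a typical entry in `blackbox.yml` looks like this (the timeout value is my own choice, adjust to your environment):

```yaml
modules:
  http_2xx:
    prober: http
    timeout: 5s               # assumption: tune to your environment
    http:
      valid_status_codes: []  # empty list defaults to 2xx
```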

Then restart Prometheus to apply the configuration: systemctl restart prometheus
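Once the probe is scraping, you'll probably want an alert on it. Here's a minimal alerting-rule sketch; the rule name, 5-minute window, and severity label are my own choices, not part of the project:

```yaml
groups:
  - name: openai-status
    rules:
      - alert: OpenAIAPIDown
        # probe_success is 0 when the prober returns a non-2xx code,
        # i.e. when OpenAI reports a degraded or down status.
        expr: probe_success{instance="http://127.0.0.1:9091/open-ai-status-prober/simplified_status"} == 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "OpenAI API appears to be down (status prober probe failing)"
```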

CLI Commands

  • Start Server: openai-api-status-prober start
  • Stop Server: openai-api-status-prober stop
  • Version: openai-api-status-prober -v
  • Env Path: openai-api-status-prober env-path

Repository: https://github.com/skywarth/openai-api-status-prober
Deployment: https://openai-api-status-prober.onrender.com/open-ai-status-prober/simplified_status
