DEV Community

Mayuresh Smita Suresh

Mathematical Optimisation in Rust: A Complete Guide to good_lp + HiGHS (Production Ready with Axum)

Modern backend systems often need to make optimal decisions under constraints.

Examples:

  • Allocate limited resources
  • Minimize operational cost
  • Select optimal product mix
  • Plan logistics efficiently
  • Build smart pricing engines

This is where mathematical optimization becomes powerful.

In Rust, two tools make this practical:

  • good_lp → Modeling layer (DSL for optimization problems)
  • HiGHS → High-performance optimization solver

This guide covers:

  • Mathematical foundations
  • What good_lp and HiGHS are
  • When to use them
  • Rust examples
  • Production-ready Axum backend integration
  • Scaling considerations

1. Mathematical Optimization (Concept)

Optimization problems look like this:

Minimize or Maximize:

f(x)

Subject to:

g1(x) ≤ b1
g2(x) = b2
x ∈ feasible set

Where:

  • x = decision variables
  • f(x) = objective function
  • g(x) = constraints

2. LP vs MIP

Linear Programming (LP)

All expressions are linear.

Example:

Maximize: 3x + 2y

Subject to:

x + y ≤ 10
x ≥ 0
y ≥ 0

LP problems are solved efficiently using simplex or interior-point
methods.
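To see why this works geometrically, here is a toy sketch in plain Rust (not how a real solver is implemented): for a bounded LP, an optimal solution lies at a vertex of the feasible region, so for the tiny example above we can simply evaluate the objective at each corner.

```rust
// Toy sketch: for a bounded LP, an optimum lies at a vertex of the
// feasible region. The simplex method exploits this by walking from
// vertex to vertex; here we just enumerate the corners directly.
fn best_vertex(
    vertices: &[(f64, f64)],
    objective: impl Fn(f64, f64) -> f64,
) -> (f64, f64, f64) {
    vertices
        .iter()
        .map(|&(x, y)| (x, y, objective(x, y)))
        .max_by(|a, b| a.2.partial_cmp(&b.2).unwrap())
        .unwrap()
}

fn main() {
    // Vertices of { x + y <= 10, x >= 0, y >= 0 }
    let vertices = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)];
    let (x, y, value) = best_vertex(&vertices, |x, y| 3.0 * x + 2.0 * y);
    println!("optimum at ({x}, {y}) with value {value}");
    // prints: optimum at (10, 0) with value 30
}
```

Real solvers never enumerate all vertices (there can be exponentially many); simplex pivots between adjacent vertices and interior-point methods approach the optimum from inside the region.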


Mixed Integer Programming (MIP)

Some variables must be integers or binary.

Example:

x ∈ {0,1}

MIP is NP-hard.

Solvers use:

  • LP relaxation
  • Branch-and-bound
  • Cutting planes
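The branch-and-bound idea can be sketched in plain Rust on a tiny 0/1 knapsack. This is a didactic toy, not how HiGHS works internally: it uses a crude optimistic bound (sum of all remaining values) where a real MIP solver would solve an LP relaxation at each node.

```rust
// Toy branch-and-bound for a 0/1 knapsack:
//   maximize sum(v[i] * x[i])  subject to  sum(w[i] * x[i]) <= cap,  x[i] in {0, 1}
fn branch(i: usize, weight: f64, value: f64, v: &[f64], w: &[f64], cap: f64, best: &mut f64) {
    if weight > cap {
        return; // infeasible branch: prune
    }
    if i == v.len() {
        if value > *best {
            *best = value; // all variables fixed: record incumbent
        }
        return;
    }
    // Optimistic bound: pretend we can take every remaining item.
    // If even that cannot beat the incumbent, prune this subtree.
    let bound = value + v[i..].iter().sum::<f64>();
    if bound <= *best {
        return;
    }
    // Branch on x[i]: first x[i] = 1, then x[i] = 0
    branch(i + 1, weight + w[i], value + v[i], v, w, cap, best);
    branch(i + 1, weight, value, v, w, cap, best);
}

fn main() {
    let values = [10.0, 6.0, 4.0];
    let weights = [5.0, 4.0, 3.0];
    let mut best = 0.0;
    branch(0, 0.0, 0.0, &values, &weights, 8.0, &mut best);
    println!("best value = {best}");
    // prints: best value = 14 (items 0 and 2, total weight 8)
}
```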


3. What Is HiGHS?

HiGHS is a high-performance open-source solver written in C++.

It supports:

  • Linear Programming (LP)
  • Mixed Integer Programming (MIP)
  • Quadratic Programming (QP)

HiGHS is the computation engine.


4. What Is good_lp?

good_lp is a Rust modeling crate.

It:

  • Builds constraint matrices
  • Translates the model to the solver's format
  • Calls a backend solver (such as HiGHS)

It does not solve problems itself.

Architecture:

Rust Code → good_lp → HiGHS → Optimal Solution


5. Installation

Cargo.toml:

[dependencies]
# Enable the HiGHS backend for good_lp; this pulls in the highs crate,
# so a separate direct dependency on it is not needed.
good_lp = { version = "1.4", default-features = false, features = ["highs"] }
axum = "0.7"
tokio = { version = "1", features = ["full"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"

6. Basic Optimization Example

Maximize: 3x + 2y

Subject to: x + y ≤ 10

use good_lp::{constraint, default_solver, variable, variables, Solution, SolverModel};

fn solve_lp() -> Result<(), Box<dyn std::error::Error>> {
    let mut vars = variables!();
    let x = vars.add(variable().min(0.0));
    let y = vars.add(variable().min(0.0));

    let solution = vars
        .maximise(3.0 * x + 2.0 * y)
        .using(default_solver)
        // Constraints are built with the constraint! macro
        .with(constraint!(x + y <= 10.0))
        .solve()?;

    println!("x = {}", solution.value(x)); // expected: x = 10
    println!("y = {}", solution.value(y)); // expected: y = 0

    Ok(())
}

7. Production-Ready Axum Backend Example

We now build a simple optimization API.

It solves:

Maximize: profit_a * A + profit_b * B

Subject to:

A + B ≤ limit
A, B ∈ {0,1}


main.rs

use axum::{http::StatusCode, routing::post, Json, Router};
use good_lp::{constraint, default_solver, variable, variables, Solution, SolverModel};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct OptimizeRequest {
    profit_a: f64,
    profit_b: f64,
    limit: f64,
}

#[derive(Serialize)]
struct OptimizeResponse {
    a: f64,
    b: f64,
    objective: f64,
}

async fn optimize(
    Json(payload): Json<OptimizeRequest>,
) -> Result<Json<OptimizeResponse>, StatusCode> {
    let mut vars = variables!();

    let a = vars.add(variable().binary());
    let b = vars.add(variable().binary());

    let objective = payload.profit_a * a + payload.profit_b * b;

    let solution = vars
        .maximise(objective.clone())
        .using(default_solver)
        .with(constraint!(a + b <= payload.limit))
        .solve()
        // Infeasible or unbounded models become a 422 instead of a panic
        .map_err(|_| StatusCode::UNPROCESSABLE_ENTITY)?;

    Ok(Json(OptimizeResponse {
        a: solution.value(a),
        b: solution.value(b),
        objective: solution.eval(objective),
    }))
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/optimize", post(optimize));

    // axum 0.7: bind a Tokio listener and serve with axum::serve
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000")
        .await
        .unwrap();
    println!("Server running at http://127.0.0.1:3000");

    axum::serve(listener, app).await.unwrap();
}

8. Running the API

Start server:

cargo run

Call endpoint:

POST http://localhost:3000/optimize

Body:

{ "profit_a": 10, "profit_b": 6, "limit": 1 }
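Equivalently, from the command line (assuming curl is installed and the server is running locally):

```shell
curl -X POST http://localhost:3000/optimize \
  -H 'Content-Type: application/json' \
  -d '{"profit_a": 10, "profit_b": 6, "limit": 1}'
```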

Response:

{ "a": 1.0, "b": 0.0, "objective": 10.0 }


9. Production Considerations

  • Set solver time limits
  • Handle infeasible models gracefully
  • Log solver status
  • Separate optimization into service layer
  • Consider async worker if solve time is long

10. Scaling

To scale:

  • Partition large problems
  • Avoid symmetry
  • Use soft constraints carefully
  • Monitor solver time
  • Use time limits for MIP

Final Thoughts

Optimization is a powerful backend capability.

good_lp provides clean modeling. HiGHS provides industrial-grade
solving. Axum provides modern Rust web infrastructure.

Together, they form a production-ready optimization backend stack.

If your system needs the best possible decision under constraints, then mathematical optimization in Rust is a strong architectural choice.

Please ask any questions in the comments; I'll be happy to guide you through them.

Thanks,
Mayuresh
