I spent the last year building Charl, a programming language designed specifically for machine learning. Not a library on top of Python, but a language where tensors and autograd are native features.
Why?
PyTorch and TensorFlow are excellent, but they're libraries bolted onto general-purpose languages. I wanted to explore: what's possible when ML is built into the language itself?
What Works Today
- Native tensor operations
- Automatic differentiation (dynamic graphs)
- Neural network training (validated on MNIST)
- 22x faster than PyTorch on CPU
- GPU support via wgpu
- Type-safe (static typing with inference)
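For readers who haven't looked inside an autograd engine, the "dynamic graphs" idea that Charl builds into the language can be sketched in a few lines of Python. This is an illustrative toy of reverse-mode autodiff, not Charl's actual implementation: each operation records its inputs as it runs, and `backward()` walks that recorded graph in reverse to accumulate gradients.

```python
class Value:
    """A scalar that records the operations applied to it (toy autograd)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = lambda: None  # filled in by each operation

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def bw():  # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = bw
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def bw():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = bw
        return out

    def backward(self):
        # Topologically sort the recorded graph, then apply the chain
        # rule from the output back to every input.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()


x, y = Value(2.0), Value(3.0)
z = x * y + x       # graph is built dynamically, as the code runs
z.backward()        # x.grad == 4.0 (= y + 1), y.grad == 2.0 (= x)
```

The graph exists only as a byproduct of executing the program, which is what makes control flow (loops, conditionals) work naturally, and is the same property PyTorch's eager mode has.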
Example: Training a Neural Network
```
// Network: 2 -> 4 -> 1
// (x, target, b1, b2 are defined earlier; omitted for brevity)
let w1 = tensor_randn([2, 4])
let w2 = tensor_randn([4, 1])
let epoch = 0

while epoch < 1000 {
    // Forward
    let h = nn_relu(nn_linear(x, w1, b1))
    let pred = nn_sigmoid(nn_linear(h, w2, b2))
    let loss = loss_mse(pred, target)

    // Backward (automatic)
    tensor_backward(loss)

    // Update
    w1 = optim_sgd_step(w1, tensor_grad(w1), 0.5)
    w2 = optim_sgd_step(w2, tensor_grad(w2), 0.5)

    epoch = epoch + 1
}
```
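To make the loop concrete without installing Charl, here is a rough NumPy equivalent (my own sketch, not part of Charl): same 2 -> 4 -> 1 shapes, same learning rate, but with the gradients that `tensor_backward` derives automatically written out by hand.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
target = np.array([[0.], [1.], [1.], [0.]])             # XOR labels

w1 = rng.standard_normal((2, 4)); b1 = np.zeros(4)
w2 = rng.standard_normal((4, 1)); b2 = np.zeros(1)
lr = 0.5
losses = []

for epoch in range(1000):
    # Forward: relu(x @ w1 + b1) -> sigmoid(h @ w2 + b2) -> MSE
    z1 = X @ w1 + b1
    h = np.maximum(z1, 0.0)
    pred = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))
    losses.append(np.mean((pred - target) ** 2))

    # Backward, by hand: MSE -> sigmoid -> linear -> relu -> linear
    dpred = 2.0 * (pred - target) / len(X)
    dz2 = dpred * pred * (1.0 - pred)
    dw2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ w2.T) * (z1 > 0)
    dw1, db1 = X.T @ dz1, dz1.sum(axis=0)

    # SGD update, the role optim_sgd_step plays in the Charl loop
    w1 -= lr * dw1; b1 -= lr * db1
    w2 -= lr * dw2; b2 -= lr * db2
```

The backward section is the part a language-level autograd removes: in Charl (as in PyTorch) those six gradient lines are derived from the forward pass automatically.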
Result: XOR converges from 0.259 → 0.006 loss (99% accuracy).
Current Limitations
- Alpha quality - expect bugs
- Small ecosystem
- Missing features (modules, generics)
- Not production-ready
This is a research experiment, not a PyTorch replacement.
Try It
- Website: https://charlbase.org
- GitHub: https://github.com/charlcoding-stack/charlcode
- Docs: Full API reference + 20+ examples
Looking For
- Testers (try it, break it, report issues)
- Feedback (what's confusing? what's missing?)
- Ideas (what would make this useful?)
Questions?
- Is the syntax intuitive for ML work?
- What's the first thing you tried that didn't work?
- What ML use case would make you actually use this?
Genuinely curious about feedback. This is an experiment - let's see where it goes.