<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: PE</title>
    <description>The latest articles on DEV Community by PE (@peshwar).</description>
    <link>https://dev.to/peshwar</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F104814%2F8c5e87b5-4049-40d0-94c9-694f7490fc98.png</url>
      <title>DEV Community: PE</title>
      <link>https://dev.to/peshwar</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/peshwar"/>
    <language>en</language>
    <item>
      <title>Why I Built mlship: One Tool for All ML Frameworks</title>
      <dc:creator>PE</dc:creator>
      <pubDate>Tue, 06 Jan 2026 16:22:30 +0000</pubDate>
      <link>https://dev.to/peshwar/why-i-built-mlship-one-tool-for-all-ml-frameworks-222k</link>
      <guid>https://dev.to/peshwar/why-i-built-mlship-one-tool-for-all-ml-frameworks-222k</guid>
      <description>&lt;p&gt;If you're learning machine learning, you've probably noticed something frustrating: every framework has its own way to serve models.&lt;/p&gt;

&lt;p&gt;Trained a scikit-learn model? Use Flask or FastAPI and write your own server code. Built a PyTorch model? Maybe try TorchServe (if you can figure out the configuration). TensorFlow? TF Serving with Docker. HuggingFace? There's transformers-serve, but it only works for transformer models.&lt;/p&gt;

&lt;p&gt;For students and data scientists who work across frameworks, this fragmentation is exhausting. You spend more time learning deployment tools than actually deploying models.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What if one command worked for everything?&lt;/strong&gt;&lt;br&gt;
That's why I built &lt;em&gt;mlship&lt;/em&gt;. It's a zero-configuration CLI that turns any ML model into a REST API with a single command:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;mlship serve model.pkl
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;That's it. No Docker. No YAML. No framework-specific configuration. It works for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;scikit-learn (.pkl, .joblib)&lt;/li&gt;
&lt;li&gt;PyTorch (.pt, .pth with TorchScript)&lt;/li&gt;
&lt;li&gt;TensorFlow (.h5, .keras, SavedModel)&lt;/li&gt;
&lt;li&gt;HuggingFace models (local or directly from the Hub)&lt;/li&gt;
&lt;/ul&gt;
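&lt;p&gt;To make the scikit-learn case concrete, here is a minimal sketch of producing the &lt;code&gt;model.pkl&lt;/code&gt; artifact that the command above would serve. This uses only standard scikit-learn and pickle; the training details are illustrative, not part of mlship itself.&lt;/p&gt;

```python
# Train a tiny scikit-learn classifier and save it as model.pkl,
# the kind of artifact `mlship serve model.pkl` expects.
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=200).fit(X, y)

# Persist with pickle; joblib would work the same way (.joblib).
with open("model.pkl", "wb") as f:
    pickle.dump(clf, f)
```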

&lt;p&gt;Example: serving a HuggingFace model straight from the Hub - you don't even need to download the model file first.&lt;/p&gt;

&lt;h3&gt;Install&lt;/h3&gt;

&lt;pre&gt;&lt;code&gt;pip install "mlship[huggingface]"
&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;Serve a sentiment analysis model&lt;/h3&gt;

&lt;pre&gt;&lt;code&gt;mlship serve distilbert-base-uncased-finetuned-sst-2-english --source huggingface
&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;Test it&lt;/h3&gt;

&lt;pre&gt;&lt;code&gt;curl -X POST http://localhost:8000/predict \
  -H "Content-Type: application/json" \
  -d '{"features": "This product is amazing!"}'
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;You get an auto-generated REST API with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;/predict&lt;/code&gt; - Make predictions&lt;/li&gt;
&lt;li&gt;&lt;code&gt;/health&lt;/code&gt; - Health check&lt;/li&gt;
&lt;li&gt;&lt;code&gt;/info&lt;/code&gt; - Model metadata&lt;/li&gt;
&lt;li&gt;&lt;code&gt;/docs&lt;/code&gt; - Interactive Swagger UI&lt;/li&gt;
&lt;/ul&gt;
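&lt;p&gt;The same request can be made from Python. The &lt;code&gt;{"features": ...}&lt;/code&gt; payload shape comes from the curl example above; the response format depends on mlship itself, so this sketch only builds and sends the request using the standard library.&lt;/p&gt;

```python
# A small Python client mirroring the curl call above.
import json
import urllib.request


def build_predict_request(text, base="http://localhost:8000"):
    """Build a POST request for the /predict endpoint."""
    return urllib.request.Request(
        f"{base}/predict",
        data=json.dumps({"features": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_predict_request("This product is amazing!")
# With `mlship serve` running locally, the prediction would come back via:
#   urllib.request.urlopen(req).read()
```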

&lt;h3&gt;Why mlship matters&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;For students&lt;/em&gt;: Learn model serving concepts once, use them across your entire ML curriculum. Stop wrestling with framework-specific tools when you should be learning ML.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;For data scientists&lt;/em&gt;: Prototype locally without Docker or cloud setup. Test your models with realistic API interactions before investing in production infrastructure.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;For educators&lt;/em&gt;: Teach framework-agnostic concepts. Your students can focus on ML fundamentals instead of deployment tooling.&lt;/p&gt;

&lt;h3&gt;What's different?&lt;/h3&gt;

&lt;p&gt;Unlike BentoML (which requires writing Python service code), TorchServe (PyTorch only), TF Serving (TensorFlow only), or transformers-serve (HuggingFace transformers only), mlship aims to be a zero-code tool that covers all of the major frameworks listed above.&lt;/p&gt;

&lt;p&gt;It's deliberately simple:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No configuration files&lt;/li&gt;
&lt;li&gt;No custom Python code required&lt;/li&gt;
&lt;li&gt;Works offline after installation&lt;/li&gt;
&lt;li&gt;Local-first (no cloud dependency)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Think of it as "one tool for your entire ML journey" - from your first scikit-learn classifier to production-grade transformers.&lt;/p&gt;

&lt;p&gt;Try it yourself:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;pip install mlship
mlship serve your_model.pkl
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Full examples in the Quick Start Guide.&lt;/p&gt;

&lt;h3&gt;We need your help&lt;/h3&gt;

&lt;p&gt;mlship is far from perfect - it's a young project with rough edges. But that's exactly why we need your help to make it better.&lt;/p&gt;

&lt;p&gt;We're looking for contributors to help with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Support for more frameworks (XGBoost, LightGBM)&lt;/li&gt;
&lt;li&gt;More HuggingFace task types (Q&amp;amp;A, translation)&lt;/li&gt;
&lt;li&gt;GPU support&lt;/li&gt;
&lt;li&gt;Bug fixes and improvements&lt;/li&gt;
&lt;li&gt;Documentation improvements&lt;/li&gt;
&lt;li&gt;Your ideas!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Found a bug? Please open an issue - we want to fix it.&lt;/p&gt;

&lt;p&gt;Have a feature idea? Open an issue and let's discuss it.&lt;/p&gt;

&lt;p&gt;Want to contribute code? Check out CONTRIBUTING.md to get started.&lt;/p&gt;

&lt;p&gt;Read the comparison with other tools: WHY_MLSHIP.md&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/sudhanvalabs/mlship" rel="noopener noreferrer"&gt;https://github.com/sudhanvalabs/mlship&lt;/a&gt; PyPI: &lt;a href="https://pypi.org/project/mlship/" rel="noopener noreferrer"&gt;https://pypi.org/project/mlship/&lt;/a&gt; License: MIT&lt;/p&gt;

&lt;p&gt;Let's make ML model serving simple for everyone.&lt;/p&gt;

</description>
      <category>machinelearning</category>
      <category>devtools</category>
      <category>python</category>
      <category>opensource</category>
    </item>
  </channel>
</rss>
