MLOps Building Blocks: Chapter 1 - FastAPI

MLOps Sep 8, 2021

As a fellow AI Engineer coming from a core AI background, where most of the time is spent reading papers, researching, or developing architectures, I personally found it really difficult to transition into enterprise solutions, which expect a lot more.

The term MLOps has recently taken the AI community by storm.

In our previous posts we covered what MLOps is and what its different branches are.

What most articles fail to cover is: how do you begin?

It is easy to say, "Go read the documentation; these are the tools to use." What is more difficult is guiding someone along a path that will actually help them reach their destination.

Then again, before we venture into the how of it, we should ask ourselves: why even do it?

Why API and AI?

But that doesn't quite answer the question now, does it?

So let's try to build an intuitive understanding.

Most AI models that engineers build live on their local systems. But the moment you want another developer or user to leverage your hard work, what would be the ideal way to go about it?

One option is to simply hand over your code and model, but only if the user has the necessary environment set up will they be able to run it.

That sounds like a very tedious task; expecting an end user to have a development environment set up is impractical.

This is the very reason why enterprises spend so much time building suitable, easy-to-use UIs for their customers.

Now the question is: how would such a UI, or any other application, use our model, when it already has a heavy code base of its own?

This is where REST APIs come into the picture.
If you are able to serve your model over an API endpoint, leveraging it in multiple applications becomes a piece of cake, and of course you get a lot more credit this way.

FastAPI is a tool that helps you build APIs intuitively and at blazing speed.

FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.

Advantages of FastAPI

The key features are:

  • Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.
  • Fast to code: Increase the speed to develop features by about 200% to 300%. *
  • Fewer bugs: Reduce about 40% of human (developer) induced errors. *
  • Intuitive: Great editor support. Completion everywhere. Less time debugging.
  • Easy: Designed to be easy to use and learn. Less time reading docs.
  • Short: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.
  • Robust: Get production-ready code. With automatic interactive documentation.
  • Standards-based: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.

Let's not beat around the bush for too long. How about we go ahead and build a dummy script, just to explore how easy it is?

Introduction to FastAPI

Now that we know what FastAPI is, let's build a small dummy API to understand how easy it is to use.

Step 0: Install the Libraries

pip3 install uvicorn
pip3 install fastapi

Step 1: Import the necessary Packages

from fastapi import FastAPI
import uvicorn

Step 2: Build your API

# Declare a FastAPI instance
app = FastAPI()

# Create a POST route with an endpoint path of your choice
@app.post("/test_router")
def test_function():
    return {"Success": "I did it"}


# Run the app with Uvicorn (serves on http://127.0.0.1:8000 by default)
if __name__ == "__main__":
    uvicorn.run(app)

Once you run the above Python script, Uvicorn serves your API on port 8000 (the default), along with automatically generated interactive documentation.

You can go ahead and open a web browser of your choice and visit the URL below: http://127.0.0.1:8000/docs
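If you want to serve on a different port, say 8080, uvicorn.run also accepts host and port arguments. A minimal sketch (the port value here is just an example):

# Serve on an explicit host and port instead of the default 8000
if __name__ == "__main__":
    uvicorn.run(app, host="127.0.0.1", port=8080)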

Step 3: Test your APIs

Once you open your browser, you will see your interactive API documentation. Feel free to test the endpoint right from there, or from code as sketched below.
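As a small sketch, assuming the server from Step 2 is still running on the default port 8000 and you have the requests package installed (pip3 install requests), the snippet below posts to the route and prints the response:

import requests

# Call the POST endpoint we defined in Step 2
response = requests.post("http://127.0.0.1:8000/test_router")

# Expected output: 200 {'Success': 'I did it'}
print(response.status_code, response.json())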

Conclusion

Congratulations! You have successfully built your first API endpoint in a matter of minutes.

All that's left is to take this up a notch and see the different methods by which one can serve a model as an API endpoint.
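As a small preview, here is a rough sketch of that pattern. The PredictionRequest schema and the dummy_model function below are purely illustrative placeholders, not a real trained model, but the shape of accepting a typed request body and returning a prediction is what we will build on:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Illustrative input schema; the field names are placeholders
class PredictionRequest(BaseModel):
    feature_1: float
    feature_2: float

# Placeholder "model"; a real service would load a trained model here instead
def dummy_model(features):
    return sum(features)

# POST endpoint that validates the request body and returns a prediction
@app.post("/predict")
def predict(request: PredictionRequest):
    score = dummy_model([request.feature_1, request.feature_2])
    return {"prediction": score}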

In our future posts, we will cover the various approaches to building enterprise-grade solutions for AI models.

You can find the complete code on GitHub right below.

AI-kosh/mlops at main · Chronicles-of-AI/AI-kosh
Archives of blogs on Chronicles of AI. Contribute to Chronicles-of-AI/AI-kosh development by creating an account on GitHub.

STAY TUNED for more on MLOps.


Vaibhav Satpathy

AI Enthusiast and Explorer
