The Know-How of MLOps
MLOps is currently one of the hottest topics in the field of AI.
Every industry, organisation, research team, and developer is, somewhere, discussing:
What is it?
What do they need?
Where to begin?
How to go about it?
Where to implement it?
There are tons of questions floating around in the minds of AI enthusiasts all over the globe.
There are loads of solutions available in the market that pave the path to MLOps, but one crucial component most of them miss is explaining the prerequisites.
Before we dive into what it takes to build an end-to-end MLOps solution, let's first try and answer some of the fundamental questions.
What is MLOps?
To put it simply, it is a perfect blend of the roles of a Machine Learning Engineer and a DevOps Engineer.
In other words, it simply stands for Machine Learning Operations. But what we also need to understand is: what are "Operations"?
In the literal sense, operations means putting something into action or effect. In our context, it means setting up the necessary components on cloud platforms and activating them for user consumption.
To understand this better, we need to dive a little deeper into the responsibilities that fall under the umbrella of MLOps.

Responsibilities of an MLOps Engineer
Five years ago, knowing how to build a machine learning model was more than enough to get your foot in the door. That's not the case anymore.
With the plethora of frameworks and tutorials floating around on the internet, the ability to develop ML models has become a baseline expectation.
This is where MLOps engineers stand out from the crowd. Let's jot down some of the responsibilities that come with the role -
- Developing models
- Training models and evaluating the results
- Packaging and tagging models
- Version control
- Deploying models into production
- Creating a continuous feedback loop for the model
- Running all of the above, flawlessly, in a loop
I know that looks scary, but before jumping to conclusions, let's take a look at another fundamental question.
Where to begin?
Well, the journey to MLOps is not a quick one. You have to go through multiple stages of learning before attaining a complete understanding of it.
But not to worry: this series of articles will take you through the various nuances involved in MLOps and unravel the mysteries and risks along your journey.
Let's not beat around the bush, shall we? Based on the topics mentioned above, it is clear that there is a lot of content to cover.
Neural Modelling
Designing neural network architectures that fit your needs is of critical importance for AI-based solutions. On top of that, there are plenty of open-source and cloud-based solutions available to jumpstart your learning process at this level.
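To make this concrete, here is a minimal sketch of defining a small architecture in PyTorch. The layer sizes and the assumed task (a 10-class classifier over 784 input features) are purely illustrative, not a recommendation for any particular problem.

```python
# A minimal, illustrative neural network definition in PyTorch.
# Layer sizes and the 784-in / 10-out shape are assumptions for the example.
import torch
from torch import nn

class SimpleClassifier(nn.Module):
    def __init__(self, in_features: int = 784, hidden: int = 128, classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SimpleClassifier()
dummy_batch = torch.randn(32, 784)   # fake batch, only to sanity-check shapes
print(model(dummy_batch).shape)      # torch.Size([32, 10])
```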



Packaging Models
Model packaging can be done in a variety of ways. Depending on the end goal and on how one wants to perform training and inference, the packaging process varies.
The most standard approach for learning packaging is Docker. Along with that, there are cloud-native solutions that offer similar packaging for edge or web applications.
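As a rough sketch, the thing you typically package into a Docker image is a small serving entrypoint like the one below. The model file name, the flat feature schema, and the choice of FastAPI with a joblib-serialized scikit-learn model are all assumptions for illustration.

```python
# serve.py - a hypothetical serving entrypoint to be packaged into a Docker image.
# "model.joblib" and the feature schema are placeholder assumptions.
from typing import List

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical serialized model artifact

class Features(BaseModel):
    values: List[float]  # hypothetical flat feature vector

@app.post("/predict")
def predict(features: Features):
    prediction = model.predict([features.values])
    return {"prediction": prediction.tolist()}
```

A Dockerfile would then simply copy this script, the model artifact, and the dependencies into the image and start it with an ASGI server, e.g. `uvicorn serve:app` (file name assumed here).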


Model Versioning
Once a model has been approved and packaged, it needs to be managed: the model version, metadata tags, the hyper-parameters used, evaluation metrics, and more need to be mapped and stored for every model release.
One of the leading open-source solutions in this space is MLflow by Databricks, and there are equally powerful cloud-native options, for example Vertex AI on Google Cloud Platform.
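Here is a minimal sketch of what that looks like with MLflow: one tracked run that stores the hyper-parameters, an evaluation metric, a release tag, and the model artifact itself. The dataset, parameter values, experiment name, and tag are illustrative assumptions.

```python
# A minimal sketch of logging a model release with MLflow.
# Dataset, hyper-parameters, experiment name, and tags are assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-versioning")  # hypothetical experiment name
with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params).fit(X_train, y_train)

    # Store hyper-parameters, evaluation metrics, and metadata tags with the run
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.set_tag("release", "v0.1.0")

    # Persist the model artifact alongside the run
    mlflow.sklearn.log_model(model, "model")
```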



Deployment into Production
Once the models are packaged and tagged, comes the industry requirement of pushing them into production. Based on your use case, the way you deploy a model for inference may vary.
For a standardised approach, the leading open-source platform is Kubeflow; in addition, every major cloud has stable support for both REST and edge-based deployment.
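Whichever platform serves the model, consumption over REST ends up looking roughly like the sketch below. The endpoint URL, auth header, and payload schema are hypothetical; the real values depend on the serving stack (for example Kubeflow/KServe or a managed cloud service).

```python
# A minimal sketch of consuming a model deployed behind a REST endpoint.
# The URL, token, and payload shape are placeholders, not a real API.
import requests

ENDPOINT = "https://models.example.com/v1/models/demo:predict"  # hypothetical URL
payload = {"instances": [[5.1, 3.5, 1.4, 0.2]]}                 # hypothetical feature row

response = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": "Bearer <token>"},  # placeholder credential
    timeout=10,
)
response.raise_for_status()
print(response.json())  # e.g. {"predictions": [...]}
```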

Conclusion
For someone aiming to kickstart their career in MLOps, this content may look overwhelming, but not to worry. As we progress in this series, we will cover all the necessary components to help you understand the big picture of an enterprise-grade solution.
I hope you found this article helpful. STAY TUNED for more content. 😁