Azid Harun
Data Engineer II at AirAsia
Kuala Lumpur, Malaysia

A pure physicist who turns data into stories and pipelines into factories.

CodersRank Score

What is this?

This score represents your current experience. It is calculated by analyzing your connected repositories. By measuring your skills through your code, CodersRank builds a ranking, so you can see how you compare to other developers and what you need to improve.


401
CodersRank Rank: Top 3%
Top 5 HCL Developer in Malaysia
Top 5 Jupyter Notebook Developer in Malaysia
Top 5 Python Developer in Malaysia
Highest experience points: 0
Activities in the last year: 0

Work Experience
AirAsia
May 2022 - Present (6 months)
Federal Territory of Kuala Lumpur, Malaysia

Data Engineer II
• Designed a serverless E2E MLOps framework template for forecast and classification use cases, enabling rapid productionalization of AI projects, fully integrated with CI/CD. Tools involved: GitLab, Docker, Pytest, Google Vertex AI Pipelines, BigQuery, Cloud Functions, Pub/Sub, Cloud Storage, Airflow DAG generator.
• Designed a generalized Docker image to pull data from any API. Tools involved: GitLab, Docker.
• Built Airflow ETL pipelines. Tools involved: GitLab, Airflow.
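As a rough illustration of the extract-transform-load shape such pipelines follow, here is a minimal stdlib sketch (the record fields and function names are illustrative, not from the actual AirAsia codebase; in production each step would run as an Airflow task):

```python
import json
from datetime import date

def extract(raw: str) -> list[dict]:
    """Parse raw JSON records pulled from a source API."""
    return json.loads(raw)

def transform(records: list[dict]) -> list[dict]:
    """Keep rows that have a fare and stamp them with a load date."""
    load_date = date(2022, 5, 1).isoformat()  # fixed date for reproducibility
    return [
        {"id": r["id"], "fare": float(r["fare"]), "load_date": load_date}
        for r in records
        if r.get("fare") is not None
    ]

def load(rows: list[dict], sink: list) -> int:
    """Append rows to a sink (a stand-in for a warehouse table); report the count."""
    sink.extend(rows)
    return len(rows)

raw = '[{"id": 1, "fare": "99.5"}, {"id": 2, "fare": null}]'
sink: list = []
loaded = load(transform(extract(raw)), sink)
```

In an Airflow DAG, each of these functions would become a task, with the intermediate data passed through external storage rather than in memory.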
Maxis
May 2021 - Apr 2022 (11 months)
Kuala Lumpur, Malaysia
MLOps Engineer
Program and Processes
• Designed and led an MLOps standardization program for the entire MLOps team, streamlining the method of productionalizing AI projects.
• Designed and led the build of a centralized performance dashboard for all AI projects, covering business, model, and pipeline health. Tools involved: Data Studio, Google BigQuery, Cloud Logging, Pub/Sub, Cloud Functions, Bitbucket.
• Revolutionized the team's MLOps ecosystem by building a Maxis-specialized, standardized machine learning pipeline, fully integrated with CI/CD. Tools involved: Bitbucket, Docker, Pytest, Google Vertex AI Pipelines, Secret Manager, BigQuery, Cloud Functions, Cloud Scheduler, Pub/Sub, Cloud Storage.
• Built a Git repository as an easy-to-use toolkit for commonly used DS processes, with unit testing. Tools involved: Bitbucket, Docker, Pytest.
• Acted as a panelist in the Pakar STEM session of Maxis's flagship community program, Maxis eKelas, encouraging and supporting younger students planning a STEM career path.
• Acted as an assessor for the Maxis Scholarship Program 2021, supporting the Maxis Young Talent team in assessing potential scholarship candidates.
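The toolkit repository mentioned above pairs each utility with a unit test. A minimal sketch of that pattern (the `winsorize` helper is hypothetical, not an actual Maxis utility):

```python
# cleaning.py -- a hypothetical toolkit utility (illustrative, not Maxis code)
def winsorize(values: list[float], lower: float, upper: float) -> list[float]:
    """Clip extreme values to [lower, upper], a common DS preprocessing step."""
    return [min(max(v, lower), upper) for v in values]


# test_cleaning.py -- the matching unit test, discoverable and runnable by pytest
def test_winsorize_clips_both_tails():
    assert winsorize([-10.0, 0.5, 99.0], 0.0, 1.0) == [0.0, 0.5, 1.0]
```

Keeping every helper this small and tested is what makes such a shared repository safe to reuse across projects.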

Productionalization
• Built a churn prediction pipeline for prepaid users. Tools involved: see the standardized pipeline tools above.
• Built a churn prediction pipeline for postpaid users. Tools involved: see the standardized pipeline tools above.
• Built an AWS recommender system pipeline recommending the best plans in the Hotlink App based on user behaviour and personalisation. Tools involved: AWS SageMaker, AWS Lambda, AWS S3, AWS Glue, AWS EC2, AWS Athena, AWS QuickSight, AWS CloudWatch, AWS EventBridge, AWS SSM.

Innovation
• Drove and was involved in building a resume search app that optimizes the HR recruitment process. Tools involved: Elasticsearch, Google Cloud Storage, App Engine Flex, Streamlit.
DKSH
Oct 2020 - Apr 2021 (6 months)
Kuala Lumpur, Malaysia
Data Scientist
• Automated a customer segmentation pipeline using Azure resources. Tools involved: Azure Synapse, Azure Blob Storage, Azure Databricks, Azure Data Factory, Azure SQL.
• Built a fraud detection pipeline to detect anomalous transactions and claims for Finance teams. Tools involved: t-SNE, an autoencoder built with PyTorch Lightning, Azure Databricks, Azure Blob Storage, Power BI.
• Built a general forecasting pipeline for univariate forecasting. Tools involved: DARTS, Azure Databricks, Azure Blob Storage, Power BI.
• Built a sentiment analysis pipeline to turn project feedback into actionable insights. Tools involved: HuggingFace, Azure Databricks, Azure Blob Storage, Power BI.
• Built a Git repository as an easy-to-use toolkit for commonly used DS processes, with unit testing. Tools involved: GitHub, Docker, Pytest.
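As a stdlib illustration of the kind of univariate model such a forecasting pipeline wraps (the actual pipeline used DARTS models, not this code), simple exponential smoothing can be written as:

```python
def simple_exponential_smoothing(series: list[float], alpha: float) -> float:
    """One-step-ahead forecast via level_t = alpha * y_t + (1 - alpha) * level_{t-1}.

    alpha in (0, 1] controls how quickly old observations are forgotten.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

history = [10.0, 12.0, 11.0, 13.0]
forecast = simple_exponential_smoothing(history, alpha=0.5)  # -> 12.0
```

With `alpha=1.0` the forecast degenerates to the last observed value; smaller values smooth over noise at the cost of reacting more slowly to level shifts.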

Projects
Phase Diagram of a Lennard-Jones System (Argon) by Molecular Dynamics Simulation in Python
Feb 2018 - Mar 2018
This project visualizes the phases of argon particles. The evolution of a system interacting through the Lennard-Jones pair potential was simulated using the velocity Verlet time integration algorithm, applying the minimum image convention and periodic boundary conditions so that the system could be visualized correctly within the simulation box. Particle positions and velocities were initialized by a separate Python utility module for molecular dynamics. An .xyz trajectory file is written out for visualization in VMD.
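The velocity Verlet integration with the Lennard-Jones force and minimum image convention described above can be sketched in pure Python (a two-particle, one-dimensional toy in reduced units, not the actual simulation code):

```python
def lj_force(r: float) -> float:
    """Signed Lennard-Jones pair force F(r) = -dV/dr for V(r) = 4*(r**-12 - r**-6),
    in reduced units; positive means repulsive."""
    return 24.0 * (2.0 * r**-13 - r**-7)

def minimum_image(dx: float, box: float) -> float:
    """Map a separation into [-box/2, box/2] (minimum image convention)."""
    return dx - box * round(dx / box)

def accelerations(x: list, box: float) -> list:
    """Accelerations of two unit-mass particles from their LJ pair force."""
    dx = minimum_image(x[1] - x[0], box)
    r = abs(dx)
    a1 = lj_force(r) * (dx / r)  # acceleration of particle 1 along the pair axis
    return [-a1, a1]             # Newton's third law for particle 0

def verlet_step(x: list, v: list, box: float, dt: float):
    """One velocity Verlet step with periodic boundary conditions (the % wrap)."""
    a = accelerations(x, box)
    x = [(xi + vi * dt + 0.5 * ai * dt * dt) % box for xi, vi, ai in zip(x, v, a)]
    a_new = accelerations(x, box)
    v = [vi + 0.5 * (ai + bi) * dt for vi, ai, bi in zip(v, a, a_new)]
    return x, v

# Two particles at rest, separated beyond the LJ minimum (2**(1/6) ≈ 1.12),
# so the attractive tail slowly pulls them together.
x, v = [0.0, 1.5], [0.0, 0.0]
box, dt = 10.0, 0.005
for _ in range(50):
    x, v = verlet_step(x, v, box, dt)
separation = abs(minimum_image(x[1] - x[0], box))
```

The full simulation extends this to N particles in 3D and writes each frame's positions to an .xyz file for VMD.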
Gamma-Ray Spectroscopy: Positron Annihilation
Jan 2018 - Jan 2018
A gamma-ray spectrometer consisting of two NaI(Tl) scintillation detectors, electronic modules (4890, 416A, 418A, 427A, and 772), and an EASY-MCA-2K was used in this experiment, which was divided into two parts: the Inverse Square Law and the Angular Correlation. These parts measured, respectively, the displacement and angular variation of the annihilation gamma-gamma coincidence counting rate. The Inverse Square Law part verified that the intensity of the gamma rays is inversely proportional to the square of the distance from the 22Na source. The Angular Correlation part showed the intensity of the gamma rays to exhibit the Dirac-delta distribution, W(θ) = δ(θ − π), when plotted against the angle around 180°.
Data Generator and Maximum Likelihood Fitting
Oct 2018 - Dec 2018
A set of 10,000 random events with a given distribution of decay time and angle is generated using the box method. A maximum likelihood fit is performed twice: first with only the decay time information, and then with both decay time and angle information. For the first fit, the best-fit values of the physics parameters F, τ1, and τ2, together with their errors, are determined to be 1.0±0.0, 0.0±0.0, and 0.0±0.0, respectively. For the second fit, the parameters F, τ1, and τ2 are determined to be 0.5459(32), 1.3952(93) μs, and 2.7088(215) μs, respectively.
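The box (accept-reject) generation and a closed-form maximum likelihood fit can be sketched for a single exponential lifetime (a simplified stdlib illustration; the actual project fitted a two-lifetime model with angle information):

```python
import math
import random

random.seed(42)

def box_method_sample(pdf, t_max: float, pdf_max: float) -> float:
    """Box (accept-reject) sampling: draw points uniformly in the box
    [0, t_max] x [0, pdf_max] and accept t when the point falls under the pdf."""
    while True:
        t = random.uniform(0.0, t_max)
        y = random.uniform(0.0, pdf_max)
        if y < pdf(t):
            return t

tau_true = 2.0
t_max = 20.0  # truncate at 10 lifetimes; the lost tail is negligible
pdf = lambda t: math.exp(-t / tau_true) / tau_true

events = [box_method_sample(pdf, t_max, pdf(0.0)) for _ in range(10_000)]

# For an exponential decay-time pdf, maximizing the log-likelihood
# sum(log pdf(t_i; tau)) analytically gives tau_hat = mean(t_i),
# so no numerical minimizer is needed in this one-parameter case.
tau_hat = sum(events) / len(events)
```

With 10,000 events the statistical error on the estimate is roughly τ/√N ≈ 0.02, so the fitted value lands close to the true lifetime of 2.0.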

Education
University of Edinburgh
Bachelor of Science in Physics, Physics
Sep 2015 - Jul 2019
University of Malaya
Foundation in Science, Physical Sciences
Jun 2014 - Apr 2015
Blockchain: Beyond the Basics
Sep 2018
Arctic Code Vault Contributor
Ethereum: Building Blockchain Decentralized Apps (DApps)
Sep 2018
