M.Sc. Robotics Student

Hi, I'm Majid Azizi

Robotics engineer passionate about autonomous systems, computer vision, and building intelligent machines that interact with the real world.

01 — About

Background & Education

I’m an M.Sc. student in Robotics Engineering with a strong focus on building real-world systems. I’m driven by the challenge of turning complex ideas into working hardware, from power distribution and embedded communication to intelligent control and machine learning.

My work spans embedded systems, power electronics, sensor integration, and applied computer vision. I enjoy architecting complete systems, designing PCBs, integrating distributed nodes, and debugging them until they’re robust and reliable.

What motivates me most is bridging hardware and software: taking a concept from schematic and code all the way to a functioning robotic platform. I’m particularly interested in system integration, intelligent automation, and applied AI in robotics.

🎓 Education

2021 – Present
M.Sc. Robotics
Mälardalen University

💼 Experience

2019 – 2020
Customer Support Manager
Techbuddy
2017 – 2019
Sales Associate
Elgiganten

02 — Skills

Technical Toolkit

🤖 Robotics & Embedded Systems

ROS / ROS2 · Kinematics · FreeRTOS · I2C · CAN

💻 Programming

Python · C/C++ · MATLAB · C# · VHDL

👁️ Vision & ML

OpenCV · PyTorch · TensorFlow · YOLO

🔩 Hardware & Tools

Raspberry Pi · Arduino · ESP · Fusion 360 · SolidWorks · 3D Printing · Git · Linux

03 — Projects

Selected Work

🦾

Intent-Driven Multimodal Humanoid Arm

This project focused on designing and integrating the full electrical and communication backbone of a bionic humanoid arm capable of intent-driven and adaptive grasping. The system combined EMG-based intent detection, computer vision, and distributed embedded control to enable intelligent manipulation. I designed and implemented a four-rail power distribution unit (12V, 7.4V, 6V, 5V) supporting ten actuators and five distributed ESP32 nodes, and helped architect a CAN-bus communication network connected to a Jetson AGX Orin for high-level processing. A major part of the work involved system-level integration, power integrity, and debugging physical-layer communication issues, including root-cause analysis of failures related to power-domain interactions and reverse current propagation. This project provided hands-on experience in real robotic system architecture, embedded communication, and hardware reliability engineering.
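The rail-by-rail budgeting behind a PDU like this can be sketched as a quick feasibility check. The rail voltages below match the project; every load count and current figure is an illustrative assumption, not a measured value from the arm.

```python
# Illustrative peak-current budget check for a four-rail PDU.
# Rail voltages mirror the project (12V, 7.4V, 6V, 5V); supply limits
# and per-load currents are hypothetical placeholders.

RAIL_CAPACITY_A = {"12V": 10.0, "7.4V": 8.0, "6V": 6.0, "5V": 4.0}  # assumed limits

LOADS = [
    ("12V", 2.5), ("12V", 2.5),                                     # large actuators (assumed)
    ("7.4V", 1.5), ("7.4V", 1.5), ("7.4V", 1.5),                    # mid-size servos (assumed)
    ("6V", 0.8), ("6V", 0.8), ("6V", 0.8), ("6V", 0.8), ("6V", 0.8),  # finger servos (assumed)
    ("5V", 0.3), ("5V", 0.3), ("5V", 0.3), ("5V", 0.3), ("5V", 0.3),  # five ESP32 nodes (assumed)
]

def rail_budget(loads, capacity, margin=0.8):
    """Sum peak draw per rail; a rail passes if it stays under `margin` of capacity."""
    draw = {rail: 0.0 for rail in capacity}
    for rail, amps in loads:
        draw[rail] += amps
    return {rail: (draw[rail], draw[rail] <= margin * capacity[rail])
            for rail in capacity}

for rail, (amps, ok) in rail_budget(LOADS, RAIL_CAPACITY_A).items():
    print(f"{rail}: {amps:.1f} A peak -> {'OK' if ok else 'OVER BUDGET'}")
```

A derating margin like the 0.8 used here is one common way to leave headroom for stall currents and simultaneous actuation before they show up as brown-outs on a shared rail.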

Embedded Systems · CAN Bus · ESP32 · Jetson Orin · Power Electronics · KiCad · EMG Sensors · Computer Vision · System Integration · Hardware Debugging

🧤

Low-Cost Sign Language Interpretation Glove

The goal of this project was to develop an affordable wearable glove capable of recognizing American Sign Language (ASL) alphabet gestures using low-cost sensing alternatives to traditional flex sensors. I built the full physical prototype, including soldering, sensor assembly, and wearable integration. The glove combined LED–photodiode bend sensors, Hall effect sensors, and an accelerometer to capture finger motion and hand orientation. Throughout development, I supported iterative hardware debugging and optimization under tight budget and time constraints. The result was a functional, low-cost sensing system demonstrating how accessible hardware can be used for gesture recognition and assistive technologies.
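One way such a sensor mix can drive gesture recognition is template matching: normalize the bend, Hall, and accelerometer readings into a feature vector and pick the nearest calibrated reference. This is a minimal sketch of that idea; the letters, template vectors, and sensor layout below are hypothetical, not the glove's actual calibration data.

```python
import math

# Hypothetical calibrated templates: one normalized vector per ASL letter.
# Layout (assumed): 5 finger-bend readings + 3 accelerometer axes, scaled to 0..1.
TEMPLATES = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.2, 0.5, 0.5, 0.0],
    "B": [0.1, 0.1, 0.1, 0.1, 0.8, 0.5, 0.5, 0.0],
    "L": [0.1, 0.9, 0.9, 0.9, 0.1, 0.5, 0.5, 0.0],
}

def classify(sample, templates=TEMPLATES):
    """Return the letter whose template is nearest (Euclidean distance) to `sample`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda letter: dist(sample, templates[letter]))

reading = [0.12, 0.88, 0.91, 0.87, 0.15, 0.5, 0.5, 0.0]  # noisy sample near "L"
print(classify(reading))  # -> L
```

Nearest-template matching is cheap enough to run on an Arduino-class MCU, which is one reason it suits a low-cost wearable better than a heavyweight model.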

Wearable Sensors · Embedded Systems · Arduino · Signal Conditioning · Hall Effect Sensors · Custom Hardware Prototyping · Low-Cost Design · Hardware Debugging

🧠

Facial Emotion Recognition for Elderly

This project investigated age-related bias in facial emotion recognition (FER) systems and explored training strategies to improve performance on elderly faces. We merged three datasets (FACES, RAF-DB, and Tsinghua) into a unified dataset of 16,399 images and structured it into training (70%), validation (15%), and test (15%) splits. Through diverse dataset fine-tuning and iterative experimentation, I helped improve validation accuracy from a 75.15% baseline to 97.6%. I also set up the full machine learning workflow for the POSTER++ model, including data preprocessing, training pipelines, and experiment tracking, enabling reliable evaluation and reproducibility across the team. This work highlighted the importance of dataset diversity in building fair and robust AI systems.
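A 70/15/15 split over a merged dataset like this reduces to a seeded shuffle plus slicing. A minimal sketch, assuming plain image paths (the file-name pattern below is illustrative, not the project's actual layout):

```python
import random

def split_dataset(items, seed=42, ratios=(0.70, 0.15, 0.15)):
    """Shuffle with a fixed seed (for reproducibility) and slice into
    train/val/test by the given ratios; the test split absorbs rounding."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]

# e.g. 16,399 image paths, matching the merged dataset's size
paths = [f"img_{i:05d}.jpg" for i in range(16399)]
train, val, test = split_dataset(paths)
print(len(train), len(val), len(test))  # 11479 2459 2461
```

In practice a split for bias studies would also stratify by emotion label and age group so each subgroup is represented in every split; the seed is what makes the partition reproducible across the team.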

Machine Learning · Computer Vision · Python · Dataset Engineering · Model Training · Bias Mitigation · Experiment Tracking · Data Preprocessing · Performance Optimization

04 — Contact

Let's Connect

I'm always interested in discussing robotics, collaborating on projects, or exploring opportunities. Feel free to reach out!