Hack2Math is a scientific calculator I conceived, designed and shipped
entirely on my own — from the first line of Swift to App Store approval. It's not just a calculator. It's a
structured mathematical workspace built around the way engineering students actually think: subject by subject,
with variables that carry meaning across operations.
The app exists because the tools available to students were either too simple to be useful
or too complex to be fast. I wanted something that could handle a mechanics problem, a circuit equation and a
calculus integral in the same session — without losing track of what each variable means. So I built it.
"I didn't ship it because it was finished. I shipped it because it was good enough
to be useful — and then kept making it better."
2+ · Years live on App Store
100% · Solo development
iOS + macOS · Platforms supported
Key Features
Subject-based workspaces
Organize operations by discipline — mechanics, calculus, circuits. Every
session has context, not just a chain of numbers.
Dynamic variables
Define named variables and reuse them throughout a session. Change one value
and the entire dependency chain updates instantly.
Custom 2D graphing
Render any function directly from the expression you're working with. Built
from the ground up using SwiftUI Canvas — no third-party libraries.
macOS menu bar mode
A dedicated menu bar app for desktop — always one keystroke away, never
stealing focus from your workflow.
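The dependency-chain behaviour of dynamic variables can be sketched in a few lines. This is an illustrative Python model, not the app's Swift implementation; the `Workspace` class and its API are invented for the example.

```python
class Workspace:
    """Toy model of a session where named variables carry meaning
    and dependencies. Hypothetical sketch for illustration only."""

    def __init__(self):
        self._defs = {}   # name -> constant, or callable of the workspace

    def set(self, name, value):
        """Define a constant value."""
        self._defs[name] = value

    def define(self, name, fn):
        """Define a variable as a function of other variables."""
        self._defs[name] = fn

    def get(self, name):
        """Resolve a variable, recomputing its dependency chain on demand."""
        v = self._defs[name]
        return v(self) if callable(v) else v


# A mechanics session: change one value, the whole chain updates.
w = Workspace()
w.set("m", 2.0)                                     # mass, kg
w.set("a", 9.81)                                    # acceleration, m/s^2
w.define("F", lambda s: s.get("m") * s.get("a"))    # F = m*a
print(w.get("F"))   # 19.62
w.set("m", 4.0)
print(w.get("F"))   # 39.24, recomputed through the chain
```

Lazy resolution keeps the sketch minimal; a production version would cache results and invalidate only the affected part of the dependency graph.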
Built during my internship at Tornillos de Córdoba, this drone was
designed to solve a specific problem: how do you inspect warehouse structures, shelving and inventory in a
facility where GPS signal doesn't exist? Consumer drones are useless indoors. This one isn't.
The core technical challenge was positioning. Without GPS, the drone has no external
reference for where it is in space. I solved this by implementing optical flow with ArduCopter
— a downward-facing camera continuously reads ground texture and feeds velocity corrections to the flight
controller, while sonar handles altitude. The result is a stable, self-locating platform that can hover and
navigate inside a building with no external infrastructure required.
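The core of the optical-flow approach fits in a few lines. This is a conceptual Python sketch of the scale-and-integrate idea, not ArduCopter's estimator; the function name and the constant-rate data are invented for the example.

```python
def integrate_flow(flow_rates, altitudes, dt):
    """Dead-reckon horizontal displacement from optical flow and sonar.

    A downward camera reports flow in rad/s; multiplying by the sonar
    altitude converts it to ground-plane velocity in m/s, which is then
    integrated over time. Illustrative sketch only.
    """
    x = 0.0
    for w, h in zip(flow_rates, altitudes):
        v = w * h      # ground-plane velocity (m/s)
        x += v * dt    # integrate to displacement (m)
    return x


# Hovering at 2 m with a constant 0.1 rad/s flow reading for one second:
# velocity = 0.2 m/s, so the platform has drifted 0.2 m.
drift = integrate_flow([0.1] * 100, [2.0] * 100, dt=0.01)
print(round(drift, 3))   # 0.2
```

This is why the sonar matters: without an altitude measurement, the same flow rate is ambiguous between flying low and slow or high and fast.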
"GPS-denied navigation isn't a niche problem. Every warehouse, factory and data
center is GPS-denied. This drone was built for those environments."
0 · GPS required for stable flight
30 FPS · Onboard vision pipeline
360° · Gimbal coverage
Technical Stack
ArduCopter + Optical Flow
Custom optical flow integration using a downward-facing camera and an ultrasonic rangefinder to achieve stable hover and drift correction with no GPS input.
Raspberry Pi (onboard compute)
Embedded Linux running the full vision and telemetry pipeline onboard — no
ground station dependency for core flight operations.
Stabilized gimbal
2-axis electronically stabilized gimbal delivering clean inspection footage
even during maneuvering and turbulence from nearby equipment.
LiDAR integration platform
Custom-designed mechanical mount accommodating LiDAR sensors for future 3D
spatial mapping of facility environments.
03 / Embedded AI
Autonomous Drone & Vehicle
C++ · OpenCV · Fusion 360 · CNC HAAS · Sensor Fusion
Overview
A fully autonomous mission system combining an aerial drone and a ground vehicle. Once a
mission is initiated, there is no human in the loop. The system detects, tracks and navigates around obstacles
in real time using a C++ computer vision pipeline running at 30 FPS with OpenCV, fusing data
from IMU, sonar and optical sensors to maintain reliable state estimation throughout.
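A minimal sketch of the fusion principle, in Python for illustration (the real pipeline is C++): a complementary filter blends an IMU-integrated velocity, which is fast but drifts, with optical-flow velocity, which is slower but drift-free. The function name, gains and data are illustrative, not the onboard estimator.

```python
def complementary_fuse(imu_accels, flow_vels, dt, alpha=0.95):
    """One-axis velocity estimate fusing IMU integration (drifts)
    with optical-flow velocity (noisy but unbiased). Sketch only."""
    v = 0.0
    for a, v_flow in zip(imu_accels, flow_vels):
        v_pred = v + a * dt                         # integrate IMU
        v = alpha * v_pred + (1 - alpha) * v_flow   # pull toward flow
    return v


# A biased accelerometer (0.5 m/s^2 offset) while truly stationary:
# pure integration would drift without bound; fusion stays bounded.
v = complementary_fuse([0.5] * 1000, [0.0] * 1000, dt=0.01)
print(v < 0.2)   # True: the flow correction caps the drift
```

The same structure generalizes to the full state vector; a production estimator would use a Kalman filter with modelled sensor covariances rather than a fixed blend factor.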
The physical chassis wasn't sourced or adapted — it was engineered from scratch. I
designed the full frame in Fusion 360 and machined it on a CNC HAAS with tolerances under ±0.01
mm. Working simultaneously on the software and the hardware that runs it changes how you think about both. You
stop treating them as separate problems.
"Writing the vision pipeline and then machining the frame that carries it — that
feedback loop is what real systems engineering looks like."
30 FPS · Real-time vision loop
±0.01 mm · CNC machining tolerance
0 · Human operators needed
Technical Stack
OpenCV computer vision
Real-time object detection and tracking pipeline written in C++, running at 30
FPS on embedded hardware without offloading to cloud compute.
Multi-sensor fusion
IMU, sonar and optical flow data fused to produce a stable state estimate in
dynamic, GPS-denied environments.
CNC HAAS manufacturing
Structural chassis designed in Fusion 360 and precision-machined on a HAAS CNC
mill. Every mounting point, joint and clearance was engineered for load and vibration.
Autonomous mission planner
State machine-based mission executor capable of real-time replanning when new
obstacles are detected mid-mission.
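The replanning behaviour described above can be sketched as a small state machine in Python (the real executor is C++; the states, waypoint names and detour policy here are invented for illustration):

```python
def run_mission(waypoints, obstacle_at):
    """Walk a waypoint list; on detecting an obstacle, switch to a
    REPLAN state that inserts a detour before resuming navigation.
    Mutates obstacle_at to mark the obstacle as handled. Sketch only."""
    state = "NAVIGATE"
    log = []                  # waypoints actually visited, in order
    queue = list(waypoints)   # remaining plan
    while queue:
        wp = queue[0]
        if state == "NAVIGATE":
            if wp in obstacle_at:
                state = "REPLAN"          # obstacle seen mid-mission
            else:
                log.append(wp)            # waypoint reached
                queue.pop(0)
        elif state == "REPLAN":
            queue.insert(0, f"detour({wp})")   # hypothetical detour leg
            obstacle_at.discard(wp)            # obstacle now routed around
            state = "NAVIGATE"
    return log


print(run_mission(["A", "B", "C"], {"B"}))
# ['A', 'detour(B)', 'B', 'C']
```

Keeping the executor as an explicit state machine is what makes mid-mission replanning tractable: new perception input only ever changes the current state and the front of the queue, never the whole plan at once.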
04 / Industrial
Yaskawa MH6
C++ · YMConnect SDK · Siemens PLC · OpenCV · TIA Portal
Overview
Real-time trajectory control for the Yaskawa Motoman MH6 — a 6-axis
industrial manipulator with ±0.08 mm repeatability. The controller handles full trajectory interpolation,
collision detection and synchronization with a Siemens S7 PLC, enabling the robot to operate as
part of a hybrid automation cell rather than as a standalone unit.
The system was built in C++ using the YMConnect SDK and integrates a
vision pipeline for part detection and pose estimation. PLC logic and robot motion are synchronized through a
deterministic communication layer — a timing error of even a few milliseconds translates directly into
positional error at the end-effector.
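Two of the claims above can be made concrete with a short sketch (Python for illustration; the real controller is C++ on the YMConnect SDK, and this linear profile stands in for its actual interpolator). Linear interpolation emits one setpoint per control period, and the cost of a timing slip falls straight out of the commanded velocity:

```python
def interpolate(start, goal, duration, period):
    """Emit one joint setpoint per control period along a linear
    profile. Conceptual sketch, not the controller's interpolator."""
    steps = int(round(duration / period))
    return [start + (goal - start) * i / steps for i in range(steps + 1)]


# A joint sweeping 90 deg in 0.5 s at a 100 ms tick:
print(interpolate(0.0, 90.0, duration=0.5, period=0.1))
# [0.0, 18.0, 36.0, 54.0, 72.0, 90.0]

# Why milliseconds matter: at 180 deg/s, a 5 ms timing slip is
# already 0.9 deg of joint error before any kinematic amplification.
print(180.0 * 0.005)   # 0.9
```

A real interpolator would shape velocity and acceleration (trapezoidal or S-curve profiles) rather than stepping linearly, but the timing arithmetic is the same.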
"Industrial automation doesn't forgive timing errors. I learned to think in
microseconds and plan trajectories that respect both physics and schedule."
6 · Degrees of freedom
±0.08 mm · End-effector repeatability
Hard RT · Control loop timing
Technical Stack
YMConnect SDK
Low-level robot control via Yaskawa's official SDK, with a custom trajectory
interpolation layer built on top for smooth, collision-aware motion.
Siemens PLC + TIA Portal
Ladder logic synchronized with the robot controller for coordinated
multi-device automation cell operation without race conditions.
OpenCV vision system
Camera-based part detection and 6-DOF pose estimation feeding directly into the
trajectory planner for adaptive pick-and-place operations.
Hard real-time control
Deterministic control loop with jitter profiling and compensation to maintain
sub-millisecond timing consistency across the entire cell.
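The fixed-period pattern behind the jitter work can be sketched in Python (illustration only: hard real time needs an RTOS or a PREEMPT_RT kernel with a pinned, priority-scheduled thread, not an interpreter). The key detail is scheduling against absolute deadlines, so one late wakeup does not shift every subsequent cycle:

```python
import time


def timed_loop(period_s, iterations, body):
    """Run body() at a fixed period using absolute deadlines and
    report the worst observed wakeup jitter in seconds. Sketch of
    the pattern, not a hard real-time implementation."""
    worst = 0.0
    deadline = time.monotonic() + period_s
    for _ in range(iterations):
        body()
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        worst = max(worst, abs(time.monotonic() - deadline))
        deadline += period_s   # absolute step: lateness never accumulates
    return worst


jitter = timed_loop(period_s=0.01, iterations=20, body=lambda: None)
print(jitter >= 0.0)   # True; the magnitude depends on the OS scheduler
```

Sleeping until `deadline` rather than for a relative `period_s` is what keeps the long-run rate exact; jitter then shows up per cycle instead of compounding across the mission.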
05 / Hardware
Robotic Arm
C++ · PETG · 3D Printing · PID Control · Serial Protocols
Overview
A fully 3D-printed robotic arm manufactured in PETG and controlled
through a real-time C++ software layer. Every structural component was designed in CAD with load, flex and joint
clearance in mind, not simply printed and trusted to work. The result is an arm that achieves submillimeter
positioning precision using custom PID controllers on each joint.
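A per-joint loop of the kind described can be sketched in Python (the arm's controller is C++; the gains and the first-order plant below are placeholders chosen for the example, not the tuned values):

```python
class PID:
    """Textbook PID with a stored integral and a backward-difference
    derivative. Gains here are illustrative, not the arm's tuning."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv


# Drive a toy integrator plant (angle += u * dt) to a 1.0 rad setpoint:
pid = PID(kp=5.0, ki=10.0, kd=0.05, dt=0.01)
angle = 0.0
for _ in range(1000):
    angle += pid.update(1.0, angle) * 0.01
print(abs(angle - 1.0) < 1e-3)   # True: settled at the setpoint
```

On the physical arm each axis gets its own gains, because inertia and joint compliance differ; that is the manual per-axis tuning mentioned below.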
This project was a deliberate challenge to the assumption that precision requires
expensive machined metal. With the right material selection, mechanical design and control algorithms,
additive manufacturing can reach industrial-grade tolerances for light-load applications. The
arm demonstrated exactly that.
"PETG is not a compromise. It's a material choice — one that required understanding
its thermal limits, flex characteristics and fatigue behaviour before trusting it with precision work."
Sub-mm · Positioning precision
PETG · Primary material
PID · Per-joint control loop
Technical Stack
PETG printed structure
Full mechanical design in Fusion 360 optimized for FDM printing — wall
thicknesses, infill density and joint geometry tuned for stiffness under load.
C++ actuator controller
Independent PID loops for each joint running on embedded hardware, with manual
tuning per axis to account for different inertia and compliance.
Serial communication layer
High-speed, low-latency serial protocol between the compute node and servo
drivers, with packet framing to eliminate communication errors at speed.
Kinematics solver
Forward and inverse kinematics implemented from scratch for Cartesian-space
trajectory control — joint angles computed from target position, not looked up from a table.
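The "computed, not looked up" approach is easiest to see on a reduced case. Below is closed-form inverse kinematics for a planar 2-link arm in Python (the actual arm has more joints and its solver is C++; the link lengths and the elbow-down branch are illustrative):

```python
import math


def ik_2link(x, y, l1, l2):
    """Closed-form IK for a planar 2-link arm, elbow-down solution.
    Raises ValueError if the target is outside the workspace."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2


def fk_2link(t1, t2, l1, l2):
    """Forward kinematics: joint angles back to Cartesian position."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))


# Round-trip check: IK then FK recovers the target point.
t1, t2 = ik_2link(0.3, 0.4, l1=0.3, l2=0.3)
x, y = fk_2link(t1, t2, 0.3, 0.3)
print(round(x, 6), round(y, 6))   # 0.3 0.4
```

Solving angles analytically from the target position is what enables smooth Cartesian-space trajectories: every intermediate point along a straight-line path gets its own exact joint solution.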
Puebla, Mexico · Available June 2026
Developer · Engineer · Builder
I build autonomous systems, industrial robots and native apps from the ground up. My work
lives at the boundary between mechanics, electronics and software — where the hardest problems actually live.
I'm Juan Fernando Meza Rodríguez, a mechatronics engineer with a focus on autonomous
systems. I design, write code and manufacture — often all three in the same project, because the problems I
find interesting don't let you specialize.
My work sits at the intersection of hardware and software: drones that navigate indoors
without GPS, industrial robots that operate without an operator, apps that real students use to get through
real exams. Every project has a user, a constraint and a reason to exist.
"I'm not satisfied with making something work. I want to understand exactly why
it works — and what would have to change for it to fail."
Completing my Mechatronics Engineering degree in June 2026. Looking for roles where
technical depth and product impact are both taken seriously — not one at the expense of the other.
Tech stack
Languages
Swift · C++ · Python · C# · JavaScript
Robotics & Embedded
OpenCV · ArduCopter · Sensor Fusion · LiDAR · Raspberry Pi
CAD & Manufacturing
Fusion 360 · SolidWorks · CNC HAAS · 3D Printing
Industrial Automation
Yaskawa MH6 · Siemens PLC · TIA Portal
Projects
What I've built.
From an app live on the App Store to drones flying themselves through industrial
facilities. Each project started with a real problem and ended with something that works.
01 / App Store
Hack2Math
A scientific calculator built for how engineers actually work — by subject, with
reusable variables and live 2D graphs. Designed, developed and published entirely on my own. Live on iOS
and macOS.
Swift · SwiftUI · iOS · macOS · App Store
View full case study →
LIVE ON APP STORE
02 / Robotics
Inspection Drone
Industrial drone for structure and inventory inspection in GPS-denied environments.
Optical flow with ArduCopter for indoor hover stability, stabilized gimbal and LiDAR-ready platform running
on embedded Linux.
ArduCopter · Python · Optical Flow · LiDAR · Raspberry Pi
View full case study →
03 / Embedded AI
Autonomous Drone & Vehicle
Fully autonomous mission system with 30 FPS computer vision, multi-sensor fusion and
zero human intervention in the loop. CNC-machined chassis at ±0.01 mm tolerance — designed and built from
scratch.
C++ · OpenCV · Fusion 360 · CNC
View full case study →
04 / Industrial
Yaskawa MH6
Hard real-time trajectory control for a 6-axis industrial robot with ±0.08 mm
repeatability. Integrated with a Siemens PLC and a vision pipeline for adaptive, camera-guided manipulation.
C++ · YMConnect · Siemens PLC · OpenCV
View full case study →
05 / Hardware
Robotic Arm
Fully 3D-printed PETG structure with individual PID-controlled joints. Submillimeter
positioning via custom C++ kinematics solver. Proof that additive manufacturing can compete with machined
precision.
C++ · PETG · 3D Printing · PID
View full case study →
Philosophy
How I think about engineering.
Good technology doesn't come from moving fast. It comes from understanding the problem
well enough to solve it once, correctly.
01
Precision over speed
Coming from a discipline where ±0.01 mm is the margin between functional and broken, I
apply that same standard everywhere. Fast code that's wrong is worse than slow code that's right. I prefer
to reason carefully and iterate deliberately.
02
Systems, not components
A drone is not just firmware. An app is not just UI. Every layer of a system has to
understand the constraints of every other layer. I design from the full stack down, not from individual
parts up.
03
Ship, then improve
Hack2Math is on the App Store. The drones have flown in real warehouses. I don't treat
prototypes as endpoints — I get working software and hardware into real environments as early as possible,
then let reality drive the next iteration.
04
Breadth as a technical asset
Knowing how to write a control loop, machine a chassis, and ship an app to the App
Store isn't a distraction from depth — it's what allows me to see problems that specialists in any single
domain would miss entirely.
Experience
Journey.
Where I've worked, what I built there and what it demanded
from me.
Feb 2025 — Dec 2025
Engineering Intern
Tornillos de Córdoba
Designed and built an autonomous inspection drone for use inside GPS-denied
industrial facilities. Implemented optical flow navigation with ArduCopter, integrated an onboard vision and
telemetry system, and engineered the mechanical platform for LiDAR sensor integration. First time taking a
robotics project from concept to deployment in a real industrial environment.
Feb 2022 — Apr 2024
Independent App Developer
Hack2Math
Designed, built and published Hack2Math on the App Store entirely on my own.
Full-stack Swift and SwiftUI development: product architecture, custom graphics engine, App Store submission
and ongoing maintenance across iOS and macOS. The app has been live for over two years.
Sep — Dec 2024
Industrial Robotics Project
Universidad · Puebla
Developed a real-time control and trajectory planning system for the Yaskawa Motoman
MH6 robot using the YMConnect SDK and C++. Integrated computer vision for adaptive part detection and
synchronized robot motion with Siemens PLC logic for coordinated automation cell operation.
Jun 2020 — Jun 2026
Mechatronics Engineering
Universidad · Puebla, Mexico
Six-year program covering mechanics, electronics, automatic control and embedded
software development. Applied coursework across every major project in this portfolio. Certifications in
SwiftUI development (Apple) and Siemens TIA Portal Associate Level 1.
Contact
Have a hard problem? Let's talk.
Open to engineering roles, technical
collaborations and projects that actually push something forward. If you're building something complex, I want
to hear about it.