The Architectural Evolution (2006 — Present)

I view my career not just as a list of jobs, but as a series of architectural shifts. Each era represents a new challenge in how we handle data and scale.

2026 - AI-Data Interoperability (MCP)

Scaling the Model Context Protocol (MCP) to connect LLMs with secure Data Lakes, enabling AI agents to interact safely with Petabyte-scale datasets.

MCP · AI Agents · Semantic Data Layers

2024 - Lakehouse & Modern Data Stack

Bridging the gap between Data Warehouses and Data Lakes. Implementing open table formats for high-performance ACID transactions on object storage.

Apache Iceberg · Kafka Streaming · Dask · Data Governance

2022 - Data Engineering & Streaming

Pivoted to real-time, event-driven architectures and distributed computing to handle high-velocity streams and Petabyte-scale data.

ActiveMQ · Spark · Hadoop · Kubernetes · Go · Airflow

2018 - AI & Machine Learning

Integrated predictive power into applications, moving from standard logic to Deep Learning and Graph-based relationship modeling.

Python · CNN · GNN · Neural Networks · PyTorch/TF

2015 - Microservices

Led the migration from a modular monolithic application to independently deployable microservices.

Spring Boot · Microservices

2012 - Cloud Transformation

Led the migration from on-premise hardware to elastic cloud environments, optimizing for high availability and global scale.

AWS / Google Cloud · SQL / NoSQL

2010 - Enterprise Modularization

Transitioned from rigid monoliths to modular "Shared Nothing" architectures. Focused on decoupled data access and the early Spring ecosystem to improve team velocity.

Spring Framework · Modular Monoliths · Hibernate · BPEL · Web Services

2008 - Enterprise Frameworks

Adopted the Spring ecosystem to handle complex enterprise requirements and "Rich Internet Application" (RIA) frontends.

Spring Framework · Struts · Hibernate · Flex/ActionScript

2006 - Monolithic Era

The foundation: Building robust, server-side rendered web applications with core Java and relational database design.

Java EE · EJB · JSP · Servlets · RDBMS (Oracle/MySQL)

Perspective: Coming from an Electrical and Electronics Engineering background, I’ve always viewed software through the lens of signal processing and circuit efficiency. This “low-level” understanding helps me optimize Big Data pipelines where every byte and millisecond counts.