End-to-End AI Research Journey

The Objective

This journal documents my end-to-end learning journey across Artificial Intelligence. The goal is to build a profound understanding of neural network architectures, from fundamental Multi-Layer Perceptrons to state-of-the-art Large Language Models.

Machine learning is evolving from academic theory into the foundational infrastructure of software. This website serves as a public index of my research, mathematical derivations, and code implementations.

Rather than treating models as black boxes, my approach focuses on understanding the underlying mechanics. This includes studying the optimization functions in Deep Learning (DL), exploring how Small Language Models (SLMs) can rival larger counterparts through data quality, and dissecting the statistics behind Natural Language Processing (NLP).

By implementing these systems, leveraging techniques like Retrieval-Augmented Generation (RAG), and analyzing research papers, I am bridging the gap between theoretical concepts and practical deployments.
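The RAG flow referenced above reduces to two steps: retrieve documents relevant to a query, then augment the generator's prompt with them. A minimal sketch, assuming a toy in-memory corpus and naive keyword-overlap scoring (both illustrative, not any production setup):

```javascript
// Toy document store standing in for a real vector index.
const docs = [
  "The Adam optimizer combines momentum with per-parameter learning rates.",
  "Multi-Layer Perceptrons stack linear layers with non-linear activations.",
  "Retrieval-Augmented Generation grounds an LLM's answer in retrieved documents."
];

// Retrieval: score each document by keyword overlap with the query.
// Real systems use embedding similarity instead of term overlap.
function retrieve(query, k = 1) {
  const terms = new Set(query.toLowerCase().split(/\W+/).filter(Boolean));
  return docs
    .map(doc => ({
      doc,
      score: doc.toLowerCase().split(/\W+/).filter(w => terms.has(w)).length
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(r => r.doc);
}

// Augmentation: build the prompt a generator model would receive.
function buildPrompt(query) {
  const context = retrieve(query).join("\n");
  return `Context:\n${context}\n\nQuestion: ${query}\nAnswer:`;
}

console.log(buildPrompt("How does Retrieval-Augmented Generation work?"));
```

The key design point is that grounding happens entirely in the prompt: the generator model is unchanged, so retrieval quality directly bounds answer quality.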

My latest findings, implementation notes, and insights are chronologically documented below:

Concrete AI projects delivered in production

Vinci Construction

"Integrated LLM capabilities directly into SAP CAP applications via SAP AI Core, enabling context-aware responses grounded in live business data without leaving the existing ERP workflow."

"Developed a Joule Skill for automated Purchase Order creation within the Vinci Construction environment, reducing manual entry time and surfacing AI-assisted suggestions at the point of procurement."

"Designed and shipped a Joule Skill for Sales Order creation, allowing sales teams to initiate orders through conversational prompts connected to live SAP S/4HANA data via SAP AI Core and OData services."

"Each Joule Skill was built on a consistent architecture: intent handling in SAP AI Core, business logic in a CAP Node.js service, and integration with S/4HANA APIs, making every skill maintainable and extensible."
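The three-layer skill architecture described above can be sketched in plain Node.js. Everything here is illustrative: the intent names, payload fields, and the `callS4Hana` stub are assumptions, since the real skills resolve intents through SAP AI Core and reach S/4HANA via authenticated OData services.

```javascript
// Layer 1: intent handling — map a resolved intent to a skill handler.
const skills = new Map();
function registerSkill(intent, handler) {
  skills.set(intent, handler);
}

// Layer 3: backend integration — stand-in for an S/4HANA OData call.
// In production this would be an authenticated OData POST to the ERP.
async function callS4Hana(entity, payload) {
  return { entity, ...payload }; // echo back a fake created record
}

// Layer 2: business logic — validate the request, then call the backend.
registerSkill("CreatePurchaseOrder", async (params) => {
  if (!params.supplier || !params.material) {
    throw new Error("supplier and material are required");
  }
  return callS4Hana("PurchaseOrder", {
    supplier: params.supplier,
    material: params.material,
    quantity: params.quantity ?? 1
  });
});

// Dispatch: what the conversational frontend would invoke.
async function handleIntent(intent, params) {
  const handler = skills.get(intent);
  if (!handler) throw new Error(`No skill registered for intent: ${intent}`);
  return handler(params);
}

handleIntent("CreatePurchaseOrder", { supplier: "ACME", material: "Cement" })
  .then(order => console.log(order));
```

Keeping the three layers separate is what makes each skill independently maintainable: a new skill only adds a handler, and a backend change only touches the integration layer.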