About Me
Education & Research
I'm a second-year student at Stanford University studying Computer Science with a concentration in artificial intelligence and computer systems design.
Currently working in the Stanford Natural Language Processing Group at the Stanford AI Lab (SAIL) under Jon Saad-Falcon, advised by Christopher Ré (Hazy Research) and Azalia Mirhoseini (Scaling Intelligence Lab).
Also working in a Hoover Institution lab with Sergey Sanovich, developing AI systems that detect LLM-generated content in foreign propaganda.
Previously worked under Neil Malhotra at the Stanford Graduate School of Business, fine-tuning LLMs on Supreme Court data and building prediction systems.
Professional Experience
This summer I'll be in NYC as a quant developer at DriveWealth, a Series D startup.
Currently working as the technical co-founder and CTO of PocketChange, a digital gift card trading platform with $50,000 in seed funding from Buc Ventures. We reached the final interview round of both YC (W25) and Techstars NYC (top 1% of 10,000+ applicants in each).
Technical Interests
Passionate about ML theory & fundamentals, NLP, computer vision, deep learning, software engineering, and full-stack development.
Currently working on projects involving deep learning, fintech, and NLP.
Experienced in end-to-end DevOps architecture, algorithms, and full-stack development.
Beyond Tech
Interested in start-ups, entrepreneurship, international politics, and global affairs.
In my free time, I read books about apologetics, study Christian theology, watch movies, cook, and play soccer.
Want to know more? Feel free to reach out at gabebo@stanford.edu.
Projects
Click on a project to view it in detail!
Experience
Scroll to see more!
Research
GPT Meets Graphs and KAN Splines: Testing Novel Frameworks on Multitask Fine-Tuned GPT-2 with LoRA
We explore the potential of integrating learnable and interpretable modules, specifically Kolmogorov-Arnold Networks (KANs) and graph-based representations, within a pre-trained GPT-2 model to enhance multi-task learning accuracy. Motivated by the recent surge in using KAN and graph attention (GAT) architectures in chain-of-thought (CoT) models, and by debates over their benefits relative to simpler architectures like MLPs, we begin by enhancing a standard self-attention transformer with Low-Rank Adaptation (LoRA), hyperparameter tuning, and L2 regularization, which yields significant improvements. To further boost interpretability and representation richness, we develop two variants that attempt to improve on the standard KAN and GAT: Graph LoRA and Hybrid-KAN LoRA (Learnable GPT). However, systematic evaluation reveals that neither variant outperforms the optimized LoRA-enhanced transformer, which achieves 55.249% accuracy on the SST test set, 99.18% accuracy on the CFIMDB dev set, and 89.9% test accuracy on paraphrase detection. On sonnet generation, we achieve a CHRF score of 42.097. These findings highlight that efficient parameter adaptation via LoRA remains the most effective strategy for our tasks: sentiment analysis, paraphrase detection, and sonnet generation.
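For readers unfamiliar with LoRA, the sketch below shows the core idea behind the adaptation strategy this project relies on: freezing a pre-trained weight matrix and learning a small low-rank update alongside it. This is a minimal PyTorch illustration, not our actual training code; the layer size, rank, and scaling factor are assumptions chosen for clarity.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank update:
    y = W x + (alpha / r) * B(A(x)), where A: d_in -> r and B: r -> d_out."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pre-trained weights
        self.A = nn.Linear(base.in_features, r, bias=False)
        self.B = nn.Linear(r, base.out_features, bias=False)
        nn.init.normal_(self.A.weight, std=0.01)
        nn.init.zeros_(self.B.weight)  # update starts at zero, so the
        self.scale = alpha / r         # wrapped layer initially matches base

    def forward(self, x):
        return self.base(x) + self.scale * self.B(self.A(x))

# Example: adapt a GPT-2-sized projection (768 -> 768); only A and B train.
layer = LoRALinear(nn.Linear(768, 768), r=8, alpha=16.0)
x = torch.randn(4, 768)
print(layer(x).shape)  # torch.Size([4, 768])
```

Because only the rank-r matrices receive gradients, the trainable parameter count drops by orders of magnitude relative to full fine-tuning, which is what makes this approach practical for multi-task adaptation of GPT-2.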
Deep Learning Approaches for Blood Disease Diagnosis Across Hematopoietic Lineages
We present a foundation-modeling framework that leverages deep learning to uncover latent genetic signatures across the hematopoietic hierarchy. Our approach trains a fully connected autoencoder on multipotent progenitor cells, reducing over 20,000 gene features to a 256-dimensional latent space that captures predictive information for both progenitor cells and downstream differentiated cells such as monocytes and lymphocytes. We validate the quality of these embeddings by training feed-forward, transformer, and graph convolutional architectures on blood disease diagnosis tasks. We also explore zero-shot prediction, using a progenitor disease-state classification model to classify downstream cell conditions. Our models achieve greater than 95% accuracy on multi-class classification, and in the zero-shot setting we achieve an F1 score greater than 0.7 on the binary classification task. Future work should further improve the embeddings to increase robustness on lymphocyte classification specifically.
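As a rough illustration of the embedding step, here is a minimal PyTorch sketch of a fully connected autoencoder that compresses ~20,000-dimensional expression profiles into a 256-dimensional latent space. The intermediate layer widths and the reconstruction objective are illustrative assumptions, not the exact architecture from the paper.

```python
import torch
import torch.nn as nn

class GeneAutoencoder(nn.Module):
    """Fully connected autoencoder that maps ~20,000 gene-expression
    features to a 256-dimensional latent embedding and back."""
    def __init__(self, n_genes: int = 20000, latent_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_genes, 2048), nn.ReLU(),
            nn.Linear(2048, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, 2048), nn.ReLU(),
            nn.Linear(2048, n_genes),
        )

    def forward(self, x):
        z = self.encoder(x)        # latent embedding used by downstream classifiers
        return self.decoder(z), z

model = GeneAutoencoder()
x = torch.randn(8, 20000)          # batch of expression profiles
recon, z = model(x)
loss = nn.functional.mse_loss(recon, x)  # reconstruction objective
print(z.shape)  # torch.Size([8, 256])
```

After training, the encoder output z serves as the compact representation fed to the feed-forward, transformer, and graph convolutional diagnosis models.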
Can Benjamin Keep His Throne as the King of Currency?
The US dollar (USD) has been the default global reserve currency since the end of World War II, meaning most countries see the dollar as a safe investment and typically keep it in their coffers to conduct trade. At its peak, more than 71% of the world's central bank reserves were held in dollar assets; that share has fallen to 59% today, but the dollar remains unrivaled. This paper explores the factors behind the dollar's continued dominance despite challenges from currencies like the Euro, the Chinese Yuan, and cryptocurrencies, examining why these alternatives have failed to dethrone the USD as the world's primary reserve currency.
Skills & Technologies
Whatchu looking for?