Hi, you can call me

Gabriel Bo

Stanford | SAIL | DriveWealth

About Me

Education & Research

I'm a second-year Stanford University student studying Computer Science with a concentration in artificial intelligence and computer systems design.

Currently working with the Stanford Natural Language Processing Group in the Stanford AI Lab (SAIL) under Jon Saad-Falcon, advised by Christopher Ré (Hazy Research) and Azalia Mirhoseini (Scaling Intelligence Lab).

Also working in a Hoover Institution lab with Sergey Sanovich, developing AI systems that detect LLM-generated content in foreign propaganda.

Previously worked under Neil Malhotra at the Stanford Graduate School of Business, fine-tuning LLMs on Supreme Court data and building prediction systems.

Professional Experience

This summer I'll be in NYC as a quant developer at DriveWealth, a Series-D startup.

Currently working as technical co-founder and CTO of PocketChange, a digital gift card trading platform with $50,000 in seed funding from Buc Ventures; we reached the final interview rounds of YC (W25) and Techstars NYC (both top 1% of 10,000+ applicants).

Technical Interests

Passionate about ML theory & fundamentals, NLP, computer vision, deep learning, software engineering, and full-stack development.

Currently working on projects involving deep learning, fintech, and NLP.

Experienced in end-to-end DevOps architecture, algorithms, and full-stack development.

Beyond Tech

Interested in start-ups, entrepreneurship, international politics, and global affairs.

In my free time, I read books about apologetics, study Christian theology, watch movies, cook, and play soccer.

Want to know more? Feel free to reach out via gabebo@stanford.edu

Experience

Summer 2025

Post-Trade Quant SWE Intern

DriveWealth

New York, NY
  • Interning in quantitative software engineering
  • Working on post-trade systems and infrastructure
  • Series-D quantitative finance startup

2025 - Present

Research Assistant

SAIL NLP Group

Stanford, CA
  • Researching NLP, ML, retrieval, and benchmarking
  • Working with Hazy Lab and Scaling Intelligence
  • Focusing on advanced machine learning applications

2025 - Present

Software Engineer & ML Researcher

AfterQuery

San Francisco, CA
  • Founding team member working on LLM benchmarking
  • Contributing to improvements on foundation models
  • YC W25 startup focused on ML innovation

2024 - Present

Research Assistant

Hoover Institution

Stanford, CA
  • Researching with Sergey Sanovich on LLMs and NLP
  • Developing AI systems to detect LLM-generated content
  • Focusing on foreign propaganda analysis

2023 - 2025

Co-founder & CTO

PocketChange

Stanford, CA
  • Co-founded digital gift card trading platform
  • Secured $50,000 in seed funding from Buc Ventures
  • Reached the final interview rounds of YC W25 and Techstars NYC

2023 - 2024

Research Assistant

Stanford Graduate School of Business

Stanford, CA
  • Conducted research in Neil Malhotra's lab at the GSB
  • Worked on fine-tuning LLMs on Supreme Court data
  • Developed prediction systems for legal analytics

2023 - Present

Undergraduate Student

Stanford University

Stanford, CA
  • Majoring in Computer Science with AI & Systems specialization
  • Engaging in research and entrepreneurship
  • Building technical expertise across multiple domains

2022 - 2023

High School Student

Plano West Senior High

Plano, TX
  • Top 4 in class with summa cum laude honors
  • 4x AIME qualifier in mathematics
  • National Champion and 2x Finalist in International Extemporaneous Speaking

Research

GPT Meets Graphs and KAN Splines: Testing Novel Frameworks on Multitask Fine-Tuned GPT-2 with LoRA

Gabriel Bo, Marc Bernardino, Justin Gu | March 2025 | Stanford CS224N Project

We explore the potential of integrating learnable and interpretable modules—specifically Kolmogorov-Arnold Networks (KAN) and graph-based representations—within a pre-trained GPT-2 model to enhance multi-task learning accuracy. Motivated by the recent surge in using KAN and graph attention (GAT) architectures in chain-of-thought (CoT) models and debates over their benefits compared to simpler architectures like MLPs, we begin by enhancing a standard self-attention transformer using Low-Rank Adaptation (LoRA), fine-tuning hyperparameters, and incorporating L2 regularization. This approach yields significant improvements. To further boost interpretability and richer representations, we develop two variants that attempt to improve the standard KAN and GAT: Graph LoRA and Hybrid-KAN LoRA (Learnable GPT). However, systematic evaluations reveal that neither variant outperforms the optimized LoRA-enhanced transformer, which achieves 55.249% accuracy on the SST test set, 99.18% on the CFIMDB dev set, and 89.9% paraphrase detection test accuracy. On sonnet generation, we get a CHRF score of 42.097. These findings highlight that efficient parameter adaptation via LoRA remains the most effective strategy for our tasks: sentiment analysis, paraphrase detection, and sonnet generation.
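
For readers unfamiliar with the central mechanism, here is a minimal PyTorch sketch of a LoRA-wrapped linear layer, the kind of low-rank adapter the abstract describes attaching to a pre-trained transformer. This is an illustrative sketch, not the project's code: the rank, scaling, and layer width are assumptions, and weight decay on the adapter parameters stands in for the L2 regularization mentioned above.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Freeze a pre-trained linear layer and learn a low-rank update:
    y = W x + (alpha / r) * B A x, with only A and B trainable."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # pre-trained weights stay fixed
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)   # update starts at zero, so the
                                             # wrapped layer initially matches the base
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))

# Illustrative usage: wrap a 768-wide projection (GPT-2's hidden size)
layer = LoRALinear(nn.Linear(768, 768), r=8, alpha=16.0)
trainable = [p for p in layer.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4, weight_decay=0.01)
out = layer(torch.randn(4, 768))             # (batch, hidden) -> (batch, hidden)
```

Zero-initializing the B factor is the standard LoRA trick: fine-tuning starts exactly from the pre-trained model's behavior, and only the small rank-r factors are updated.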

Deep Learning Approaches for Blood Disease Diagnosis Across Hematopoietic Lineages

Gabriel Bo, Justin Gu, Christopher Sun | March 2024 | Stanford CS229 Project

We present a foundation modeling framework that leverages deep learning to uncover latent genetic signatures across the hematopoietic hierarchy. Our approach trains a fully connected autoencoder on multipotent progenitor cells, reducing over 20,000 gene features to a 256-dimensional latent space that captures predictive information for both progenitor and downstream differentiated cells such as monocytes and lymphocytes. We validate the quality of these embeddings by training feed-forward, transformer, and graph convolutional architectures for blood disease diagnosis tasks. We also explore zero-shot prediction using a progenitor disease state classification model to classify downstream cell conditions. Our models achieve greater than 95% accuracy for multi-class classification, and in the zero-shot setting, we achieve greater than 0.7 F1-score on the binary classification task. Future work should improve embeddings further to increase robustness on lymphocyte classification specifically.
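
As a rough illustration of the embedding pipeline, below is a minimal PyTorch sketch of a fully connected autoencoder compressing an expression profile into a 256-dimensional latent vector. Only the roughly 20,000-gene input and 256-dimensional latent space come from the abstract; the hidden width, depth, and reconstruction loss are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeneAutoencoder(nn.Module):
    """Compress ~20,000 gene-expression features into a 256-dim embedding,
    then reconstruct the input from that embedding."""
    def __init__(self, n_genes: int = 20000, latent_dim: int = 256):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_genes, 2048), nn.ReLU(),
            nn.Linear(2048, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 2048), nn.ReLU(),
            nn.Linear(2048, n_genes),
        )

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)          # latent embedding reused by downstream classifiers
        return self.decoder(z), z

# Illustrative training step on a random batch of expression profiles
model = GeneAutoencoder()
x = torch.randn(8, 20000)            # stand-in for normalized expression data
recon, z = model(x)
loss = F.mse_loss(recon, x)          # reconstruction objective
loss.backward()
```

After training, the encoder alone produces the 256-dimensional embeddings that the downstream feed-forward, transformer, and graph convolutional classifiers consume.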

Can Benjamin Keep His Throne as the King of Currency?

Gabriel Bo | March 2024 | Stanford Economic Review

The US dollar (USD) has been the default global reserve currency since the end of World War II, meaning most countries see the dollar as a safe investment and typically keep it in their coffers to conduct trade. At its highest, more than 71% of the world's central bank reserves utilized dollar assets—that number has dropped to 59% today, but it still remains unrivaled. This paper explores the factors behind the dollar's continued dominance despite challenges from currencies like the Euro, Chinese Yuan, and cryptocurrencies, examining why these alternatives have failed to dethrone the USD as the world's primary reserve currency.

Skills & Technologies

AWS
Azure
C
C++
CSS
Docker
FastAPI
Firebase
Flutter
GCP
Git
GitHub
GitLab
HTML
Hugging Face
Java
JavaScript
Jenkins
Kotlin
Linux
MongoDB
Next.js
Node.js
NPM
NumPy
Pandas
PostgreSQL
Python
PyTorch
QuantLib
R
React
Redis
SciPy
Stata
Swift
Tailwind
TensorFlow
TypeScript