
Amirhosein Akbari developed and enhanced educational resources on large language models in the SharifiZarchi/Introduction_to_Machine_Learning repository over a two-month period. He created a comprehensive LaTeX slide deck covering foundational language model concepts, with a focus on parameter-efficient fine-tuning techniques such as Adapters, LoRA, and ULMFiT. His work spanned technical writing, presentation development, and the integration of visuals and references to support onboarding and project scoping. By updating the slides with new images and revised text, Amirhosein improved the clarity and depth of the LLM material, producing a centralized, reusable resource for team knowledge transfer and future machine learning initiatives.
December 2024: Delivered an enhanced Large Language Models (LLM) presentation slide deck in the SharifiZarchi/Introduction_to_Machine_Learning repository, updating the LaTeX slides with new images and revised text to provide a more comprehensive overview of LLM concepts and architectures.
November 2024 monthly delivery: Created and delivered the LLM Foundations and PEFT Slide Deck in the SharifiZarchi/Introduction_to_Machine_Learning repo. The deck covers foundational concepts of language models, specifics of LLMs, and parameter-efficient fine-tuning (PEFT) techniques including Adapters, Compacters, BitFit, LoRA, QLoRA, Prefix Tuning, and ULMFiT, with visuals and references to support onboarding, stakeholder communication, and project scoping.
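To give a concrete sense of one technique the deck covers, the sketch below shows a minimal LoRA-style linear layer in PyTorch. It is an illustrative assumption written for this summary, not code taken from the slides: the pretrained base weight is frozen, and only the low-rank factors A and B are trained, so the effective weight becomes W + (alpha / r) * B @ A.

```python
# Minimal LoRA-style linear layer (illustrative sketch, not from the slide deck).
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen pretrained projection; only the LoRA factors receive gradients.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base output plus the trainable low-rank update, scaled by alpha / r.
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling


if __name__ == "__main__":
    layer = LoRALinear(64, 32)
    out = layer(torch.randn(4, 64))
    print(out.shape)  # torch.Size([4, 32])
```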
