LLM Hallucination Index

The LLM Hallucination Index 2023 by Galileo is a framework for ranking and evaluating the propensity of large language models (LLMs) to generate hallucinations: incorrect or fabricated responses.

The goal was to design a three-page website to showcase the Hallucination Index and capture emails. The design needed to work as a sub-brand of Galileo and provide a clean, comprehensive reading experience.

Since the hero was the only place where an impactful visual treatment was welcome, I conceptualized an animation featuring letters with hypnotic effects to convey the idea of hallucinations in large language models. Together with a developer, we brought the idea to life. I worked on this while at Brightscout.

Nikola Milosevic