Chris Shinnimin

Final Fantasy Bot Project Journal

A fun personal project to learn LLMs and React, and rekindle my love of a favourite childhood game.

September 11, 2025


React App Connected to the LLM!

Key learnings today:

  • React function components are not so scary after all.
    • After initially finding the readability difficult, it only took a day to get more comfortable with function components. A few well-placed code comments laying out what's happening go a long way.
  • A model may behave differently via the API than via the CLI.
    • I switched to falcon3 yesterday because it performed better in my CLI testing, but once I connected the React bot app to the LLM API today, falcon3 couldn't follow the exact same instructions it handled via the CLI. I switched back to Llama 3.2, which understands the instructions properly. (A minimal sketch of the API call appears after this list.)
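
For the curious, here is a minimal sketch of what "via the API" means here: the same instructions that went through ollama run on the CLI, sent instead to the local Ollama server's chat endpoint. The default port and the llama3.2 model tag reflect my setup; the instructions and prompt text are placeholders.

```ts
// Sketch of the API-side call. The CLI-side equivalent was simply running
// `ollama run llama3.2` and pasting the same instructions into the prompt.
async function askBot(instructions: string, playerInput: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // falcon3 followed these instructions via the CLI but not here
      messages: [
        { role: "system", content: instructions },
        { role: "user", content: playerInput },
      ],
      stream: false, // return one JSON object instead of a stream
    }),
  });
  const data = await res.json();
  return data.message.content; // the assistant's reply
}
```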

Getting Close to a Full-Scale POC!

Today's blog writeup will be short. I'll let my video do most of the talking. A summary of today's accomplishments:

  • Refactored all components in the React app to be function components to align with best practice.
  • Separated the concepts of "App Messages" and "LLM Messages" and created separate types to represent each. Although both currently have a similar structure (each contains a role/persona attribute and a content/message attribute), I decided it was best to model them as separate entities for a few reasons (a rough sketch of the two types follows this list):
    • They represent different "real world" entities. An App Message represents a message in the chat window of the app. An LLM Message represents a message sent to, or received from, the LLM by the app.
    • Not every LLM message will also generate an app message. There will be LLM messages behind the scenes, so the two don't map one-to-one.
    • Even though they have similar attributes now, we need to think about scalability and how they might diverge if the app becomes more complex.
  • Added the LLM hook and an LLM API module to the bot app. The code that interacts with the API directly lives in its own file at /src/api/ollamaApi.ts. Keeping it separate from the hook has several advantages:
    • Separation of concerns. The API file is concerned only with interacting with the API, whereas the getLLM hook remains focused on managing React state and UI logic.
    • Reusability. Easier to reuse the API logic in isolation.
    • Organization. If we want to implement different APIs, perhaps alternatives to Ollama, they can be created and swapped in and out of the hook quickly and easily, instead of requiring extensive modifications to the hook. (A rough sketch of this layout also follows the list.)
  • Prepared for the next project day by creating a utility method to read the NES RAM contents from the RAMdisk via a symlink. Tested and working. So, next project day, I can be off to the races building the RRR and RWR modules. (A sketch of the read utility is below as well.)
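
To make the App Message / LLM Message split above concrete, here is a rough sketch of the two types. The field names are illustrative rather than the exact ones in the repo.

```ts
// Illustrative sketch; the real field names in the repo may differ.

// What the app exchanges with the LLM API: a chat-style role + content pair.
type LlmMessage = {
  role: "system" | "user" | "assistant";
  content: string;
};

// What the chat window renders: a persona + message pair, plus room for
// UI-only fields (the timestamp is a made-up example) that the LLM never sees.
type AppMessage = {
  persona: "player" | "bot";
  message: string;
  timestamp?: number;
};

// Not every LlmMessage becomes an AppMessage: system prompts and other
// behind-the-scenes exchanges stay on the LLM side.
function toAppMessage(msg: LlmMessage): AppMessage | null {
  if (msg.role === "system") return null;
  return { persona: msg.role === "user" ? "player" : "bot", message: msg.content };
}
```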
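
Likewise, here is a rough sketch of how the API file and the hook divide the work. Apart from the ollamaApi.ts path mentioned above, the names (sendChat, the useLLM internals) are placeholders, not the exact code in the repo.

```ts
// src/api/ollamaApi.ts (sketch) — its only concern is talking to the Ollama endpoint.
export type LlmMessage = { role: "system" | "user" | "assistant"; content: string };

export async function sendChat(messages: LlmMessage[]): Promise<LlmMessage> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3.2", messages, stream: false }),
  });
  const data = await res.json();
  return data.message; // { role: "assistant", content: "..." }
}
```

The hook then only manages React state and decides when to call the API; swapping Ollama out for another backend means swapping the imported API function, not rewriting the hook.

```ts
// Hook (sketch) — only React state and UI logic live here.
import { useState } from "react";
import { sendChat, type LlmMessage } from "../api/ollamaApi";

export function useLLM() {
  const [messages, setMessages] = useState<LlmMessage[]>([]);
  const [loading, setLoading] = useState(false);

  async function send(content: string) {
    const outgoing = [...messages, { role: "user" as const, content }];
    setMessages(outgoing);
    setLoading(true);
    try {
      const reply = await sendChat(outgoing);
      setMessages([...outgoing, reply]);
    } finally {
      setLoading(false);
    }
  }

  return { messages, loading, send };
}
```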
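
Finally, a sketch of what the RAM-reading utility amounts to. The symlink path here is a placeholder; the real one points at the emulator's RAM dump on the RAMdisk. The 2 KB size is the NES's built-in work RAM.

```ts
import { readFileSync } from "node:fs";

// Sketch only: the path is a placeholder for the repo's actual symlink,
// which points at the RAM dump the emulator writes to the RAMdisk.
const NES_RAM_LINK = "./nes-ram.bin";
const NES_RAM_SIZE = 0x800; // 2 KB of NES work RAM ($0000-$07FF)

export function readNesRam(): Uint8Array {
  // readFileSync follows the symlink and returns the file's current contents.
  const buf = readFileSync(NES_RAM_LINK);
  return new Uint8Array(buf.buffer, buf.byteOffset, Math.min(buf.length, NES_RAM_SIZE));
}
```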


Demo of Today's Accomplishments