- San Diego AI Surf App (Could be scaled to the rest of the world)
- Uses the spots I love to go to
- A way to show my knowledge of DBs, JWT, and Docker, and to learn LLMs
Easy method
- Download Docker 🐋
- Run this in the terminal: `docker-compose up --build`
- This step may take a while because it has to download the lite version of the AI model
- How long it takes depends on how fast your wifi is
- Access the website at http://127.0.0.1:8012
What are the db tables
- user
  - id (int)
  - username (string)
  - email (string)
  - favorite_spots (JSON; holds the name and Surfline spot ID for each saved spot)
  - password (hashed)
- spot_forecast
  - id (int)
  - spot_id (string)
  - spot_name (string)
  - time (DateTime)
  - surf_min (int)
  - surf_max (int)
  - wave_height (double; average of surf_min and surf_max)
  - wind_speed (double)
  - wind_direction (double; used for determining onshore and offshore wind)
  - surf_optimial (optimal-surf value given by Surfline)
- spot_id
  - id (int)
  - name (string)
  - spot_id (string; the Surfline spot ID)
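For reference, here is a rough sketch of how these tables could look as Flask-SQLAlchemy models. This is only an illustration based on the column list above, not the actual code; the class names, the use of Flask-SQLAlchemy, and the example JSON shape are assumptions.

```python
# Hypothetical Flask-SQLAlchemy models matching the column list above.
# Column names and types follow the README; everything else is an assumption.
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class User(db.Model):
    __tablename__ = "user"
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String, unique=True, nullable=False)
    email = db.Column(db.String, unique=True, nullable=False)
    # e.g. [{"name": "Some Spot", "spot_id": "abc123"}]  (placeholder values)
    favorite_spots = db.Column(db.JSON, default=list)
    password = db.Column(db.String, nullable=False)  # stored hashed

class SpotForecast(db.Model):
    __tablename__ = "spot_forecast"
    id = db.Column(db.Integer, primary_key=True)
    spot_id = db.Column(db.String)
    spot_name = db.Column(db.String)
    time = db.Column(db.DateTime)
    surf_min = db.Column(db.Integer)
    surf_max = db.Column(db.Integer)
    wave_height = db.Column(db.Float)     # average of surf_min and surf_max
    wind_speed = db.Column(db.Float)
    wind_direction = db.Column(db.Float)  # used for onshore/offshore checks
    surf_optimial = db.Column(db.Float)   # optimal-surf value from Surfline

class SpotId(db.Model):
    __tablename__ = "spot_id"
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String)
    spot_id = db.Column(db.String)  # the Surfline spot ID
```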
General Info to know
- Uses JWT
- All API methods are found in `auth.py`
- Contains pretty much every necessary CRUD method for the app
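As a rough illustration of the JWT flow (not the actual `auth.py` code; it assumes flask_jwt_extended, and the route names and payload fields are placeholders):

```python
# Sketch of a JWT login + protected route, assuming flask_jwt_extended.
# Routes and fields are hypothetical, not the real auth.py API.
from flask import Flask, jsonify, request
from flask_jwt_extended import (JWTManager, create_access_token,
                                get_jwt_identity, jwt_required)

app = Flask(__name__)
app.config["JWT_SECRET_KEY"] = "change-me"
jwt = JWTManager(app)

@app.post("/login")
def login():
    data = request.get_json()
    # Real code would look the user up in the user table and check the hashed password.
    token = create_access_token(identity=data["username"])
    return jsonify(access_token=token)

@app.get("/me")
@jwt_required()
def me():
    # The identity set at login is available on every protected request.
    return jsonify(username=get_jwt_identity())
```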
The spot_id table is pretty self-explanatory
- Only holds the spot names and Surfline spot IDs
- The Surfline spot ID is important because it is used to fetch the data from Surfline
- That data is then stored directly in the spot_forecast table
- All API methods are found in `spot.py`
- Currently the only way to add spots is through Postman (a sketch of such an endpoint is shown after this list)
- The reason I have so few spots is that I don't want my computer to take too long
- You have to manually find the spot IDs, reference
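The sketch below shows roughly the kind of endpoint that Postman request could hit. It is not the real `spot.py` code; the route path, field names, and the `models` import are placeholders tied to the model sketch earlier.

```python
# Hypothetical "add spot" endpoint of the sort Postman would call.
from flask import Blueprint, jsonify, request

from models import db, SpotId  # hypothetical module from the model sketch above

spot_bp = Blueprint("spot", __name__)

@spot_bp.post("/spots")
def add_spot():
    data = request.get_json()
    # The Surfline spot ID has to be found manually and passed in the request body.
    spot = SpotId(name=data["name"], spot_id=data["spot_id"])
    db.session.add(spot)
    db.session.commit()
    return jsonify(id=spot.id, name=spot.name, spot_id=spot.spot_id), 201
```

From Postman this would be a POST with a JSON body like `{"name": "...", "spot_id": "..."}`.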
The spot_forecast table is a bit more complicated than user
- How does it work?
- Every time the server starts, it runs a script over all the spots found inside the spot_id table (roughly sketched below)
- This script is found in the data directory
- All the relevant information is then compiled and stored in the spot_forecast table
- It pulls the surf forecast in 3-hour increments
- Most of the time only the most recent data is fetched, by filtering on the time (DateTime) column
- An improvement that could be made: delete the old data. I haven't decided whether or not I'm going to use it, but I might not
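A minimal sketch of what that startup script could look like. The forecast URL, request parameters, and response field names are placeholders (this is not the real script in the data directory or the real Surfline API); it only shows the loop-over-spot_id, fetch, and store-in-spot_forecast idea in 3-hour steps.

```python
# Hypothetical startup refresh: loop over spot_id rows, fetch each forecast,
# and store it in spot_forecast. URL and response fields are placeholders.
from datetime import datetime
import requests

from models import db, SpotId, SpotForecast  # hypothetical module from the model sketch above

FORECAST_URL = "https://example.com/forecast"  # placeholder, not the real Surfline endpoint

def refresh_forecasts():
    # Run inside the Flask app context when the server starts.
    for spot in SpotId.query.all():
        resp = requests.get(FORECAST_URL,
                            params={"spotId": spot.spot_id, "intervalHours": 3})
        resp.raise_for_status()
        for entry in resp.json()["data"]:  # assumed response shape
            surf_min, surf_max = entry["surf_min"], entry["surf_max"]
            db.session.add(SpotForecast(
                spot_id=spot.spot_id,
                spot_name=spot.name,
                time=datetime.fromtimestamp(entry["timestamp"]),
                surf_min=surf_min,
                surf_max=surf_max,
                wave_height=(surf_min + surf_max) / 2,  # average of min and max
                wind_speed=entry["wind_speed"],
                wind_direction=entry["wind_direction"],
                surf_optimial=entry["optimal_score"],
            ))
    db.session.commit()
```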
- This is mainly about the `ask()` endpoint inside the `shelly.py` file
- I use a template every time the AI is prompted; it gives context through directions on how to respond, the chat history, the question asked, and the current conditions
- I then call `chain.invoke()` with the variables it needs and return the result as the response
- There is some weird error checking just because I was having problems earlier, but nothing wrong with having some extra checking
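A rough sketch of that pattern with langchain_ollama. The template wording, variable names, and error handling here are assumptions, not the real `shelly.py` prompt.

```python
# Sketch of the prompt-template + chain.invoke() pattern described above.
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import OllamaLLM

# Placeholder prompt; the real template's directions and variables may differ.
template = ChatPromptTemplate.from_template(
    "You are Shelly, a surf assistant. Answer using the conditions below.\n"
    "Chat history: {history}\n"
    "Current conditions: {conditions}\n"
    "Question: {question}"
)

# llama3.2:1b served by Ollama on its default port, 11434.
model = OllamaLLM(model="llama3.2:1b", base_url="http://localhost:11434")
chain = template | model

def ask(question: str, history: str, conditions: str) -> str:
    # Light error handling, mirroring the extra checks mentioned above.
    try:
        return chain.invoke({"question": question,
                             "history": history,
                             "conditions": conditions})
    except Exception as exc:
        return f"Shelly had trouble answering: {exc}"
```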
- The Ollama port is 11434
- This is a different port from the Flask app
- The method used from `langchain_ollama` accesses this port
- I'm using the llama3.2:1b model: smaller size, easier to download, faster, and no harder info for the chatbot to interpret
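A quick way to sanity-check that the two ports are separate, assuming both services are reachable on localhost (inside Docker Compose the Ollama host is usually the service name instead):

```python
# Check that Ollama (11434) and the Flask app (8012) are both up.
import requests

print(requests.get("http://localhost:11434").text)         # Ollama replies "Ollama is running"
print(requests.get("http://127.0.0.1:8012").status_code)   # the Flask app's port
```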
- For more information on `langchain_ollama`, use this reference
- The HTML pages are found inside the templates directory
- I use a `navbar.html` that I include in all the pages
- The fading effect is just some CSS transitions activated after a timeout
- Every type of transition can be found in its respective js file
- Used Tailwind for most of the project
- Check static -> style.css for custom styles
- There are also some inline styles on specific elements via the style attribute
- Most of the JS is for fetching data from the DB and putting it on the page
- The special transition for the account page is found in `accountBackground.js`
No need to read, just some of my thoughts
- When I first started coding I absolutely hated working in the backend. I thought website design was more of my thing and I stuck to just updating the front end on most of my school projects
- I didn't start to enjoy backend code until my junior year of computer science; by that time the backend we used in class was made with Java Spring Boot.
- However, in my sophomore year in high school, the year of mainly frontend work, I never fully understood the Python Flask backend we used.
- This project was not only something that I wanted to put on my resume that had to do with AI, but it was also to prove I could use Flask at a high level.
- By the end of high school I had mostly mastered every aspect of full-stack development, but not knowing how Flask worked always ate at me.
- This project proves to me that I can code Flask repos, and that I have mastered all the tools I used in my high school career.
- This is a good confidence boost for me as I enter the next computer science chapter, most likely college. I'll most likely be learning React/TypeScript on my next project because coding everything in plain HTML, JS, and CSS is pretty annoying. Plus I've got a couple of cool ideas for it.