doctornick42/fat-advisor

FatAdvisor

FatAdvisor is an AI-powered nutrition assistant built in .NET. It uses the FatSecret API to fetch food and weight logs and leverages a language model to provide personalized dietary suggestions and analysis.

This project is currently at the prototype / proof-of-concept stage, starting from a console interface and expanding toward more advanced “agentic” features.
Part 1 of the dev blog series gives an overview of the architecture and goals.


Motivation

Many users already track their meals, calories, and weight using services like FatSecret. The idea behind FatAdvisor is to let an AI agent:

  • fetch your recent nutrition and weight logs
  • reason over trends (macros, calories, deviations)
  • respond to natural-language questions like “What should I eat tonight?” or “How can I reduce carbs tomorrow?” using personalized data rather than generic advice

Architecture & Components

The solution is organized as a Visual Studio / .NET solution with three main projects. These correspond to layered responsibilities:

| Project | Purpose / Responsibility | Key Features |
| --- | --- | --- |
| FatAdvisor.Console | Entry point / UI for user interaction | Hosts Program.cs; configures dependency injection and logging; hosts the runner |
| FatAdvisor.Ai | Core logic for AI orchestration | Wraps Semantic Kernel; builds prompt templates; handles decision logic |
| FatAdvisor.FatSecretApi | Integration with the FatSecret REST API | DTOs; HTTP client; authentication; request/response models |

Project Dependencies & Flow

  • FatAdvisor.Console depends on both Ai and FatSecretApi.
  • FatAdvisor.Ai may depend on FatSecretApi (or at least its models), e.g. so the agent can call API functions directly.
  • FatSecretApi is self-contained with no dependencies on AI logic.
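The dependency flow above can be sketched as host wiring in the console project. This is a minimal illustration only, assuming the generic host and DI packages; the type names FatSecretClient, NutritionAdvisor, and AdvisorRunner are placeholders, not the repo's actual API.

```csharp
// Program.cs in FatAdvisor.Console -- a hypothetical sketch of how the three
// projects could be composed via the .NET generic host. All service type
// names here are illustrative assumptions, not the repository's real classes.
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// FatAdvisor.FatSecretApi: a typed HTTP client wrapping the FatSecret REST API.
builder.Services.AddHttpClient<FatSecretClient>();

// FatAdvisor.Ai: the orchestration layer that wraps Semantic Kernel and
// builds prompt templates from the fetched logs.
builder.Services.AddSingleton<NutritionAdvisor>();

// The console runner that drives the user-interaction loop.
builder.Services.AddHostedService<AdvisorRunner>();

await builder.Build().RunAsync();
```

Registering the API client in DI keeps FatSecretApi free of any AI dependencies while still letting the Ai layer consume it through constructor injection.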

Configuration & Secrets

Sensitive configuration is loaded via .NET User Secrets or environment variables, so keys are never checked into source control.

You must configure at least:

  • GitHubModels:ApiKey — your token for the LLM service
  • GitHubModels:Endpoint — the API endpoint (e.g. the models inference endpoint)

Example:

cd FatAdvisor.Console
dotnet user-secrets init
dotnet user-secrets set "GitHubModels:ApiKey" "<your-token>"
dotnet user-secrets set "GitHubModels:Endpoint" "https://models.inference.ai.azure.com"

You can also override via environment variables if preferred.
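For the environment-variable route, note that the .NET configuration binder maps the `:` separator in keys to a double underscore `__` in environment variable names, so the equivalents of the user-secrets keys above would be:

```shell
# Environment-variable equivalents of the user-secrets keys.
# .NET configuration maps ":" in a key to "__" in the variable name.
export GitHubModels__ApiKey="<your-token>"
export GitHubModels__Endpoint="https://models.inference.ai.azure.com"
```

Environment variables take precedence over User Secrets in the default host configuration order, which makes them convenient for CI or container deployments.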


Future Plans & Roadmap

Planned enhancements (some already hinted at in the blog post):

  • Add error handling, rate limiting, caching, and retry logic
  • Improve prompt engineering (chain-of-thought, few-shot examples)
  • Add web UI
