LLM-for-GoFreight-API-Guide is a tool designed to simplify the way developers interact with API documentation. It harnesses the power of large language models (LLMs) to provide users with intuitive guidance on using their APIs effectively. By feeding an openapi.json file into the tool, developers can obtain clear instructions on how to execute API calls, understand the functionality of various endpoints, and troubleshoot common issues.
Before you begin, ensure you have the following:
- Python 3.11 or higher installed on your system.
- An openapi.json file you wish to analyze.
- An API key from AWS Bedrock.
Use the following commands to install and launch the mock server:
```shell
$ npm install @stoplight/prism-cli
$ npx prism mock api-spec.yaml
```

Ensure you have Poetry installed on your system. If you do not have Poetry installed, you can install it by following the instructions in the Poetry documentation.
With Poetry installed, you can set up your project environment and install all required dependencies by running:
```shell
$ poetry install
```

To run the LLM-for-GoFreight-API-Guide tool, you need to configure it with your GoFreight API key, the path to your openapi.json file, and your website's base URL.
Run the tool using the following command:
```shell
$ python3 api_selector.py --openapi-json <openapi-json-path> --base-url <your-website-base-url> --gf_api_key <your-gofreight-api-key>
```

The LLM-for-GoFreight-API-Guide consists of several components that work together to provide an interactive experience for analyzing and understanding API documentation. Here's a breakdown of each file and its role:
The OpenAPI Parser is a bespoke component that ingests your openapi.json specification. Its primary function is to parse and extract essential details such as endpoints, parameters, and schema definitions, converting them into a structured format that the GPT-3.5 language model can process and interpret.
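As a rough sketch of what this parsing step might look like (the function name, output shape, and sample spec below are illustrative, not the tool's actual internals):

```python
def parse_openapi(spec: dict) -> list[dict]:
    """Flatten an OpenAPI spec into one summary per (path, method) pair.

    The summaries carry just the fields a prompt would need: the path,
    the HTTP method, a human-readable summary, and the parameter names.
    Real specs can also hold path-level keys such as "parameters", which
    this simplified sketch does not handle.
    """
    endpoints = []
    for path, methods in spec.get("paths", {}).items():
        for method, details in methods.items():
            endpoints.append({
                "path": path,
                "method": method.upper(),
                "summary": details.get("summary", ""),
                "parameters": [p["name"] for p in details.get("parameters", [])],
            })
    return endpoints

# Tiny inline spec for demonstration.
spec = {
    "paths": {
        "/shipments": {
            "get": {
                "summary": "List shipments",
                "parameters": [{"name": "status", "in": "query"}],
            }
        }
    }
}
endpoints = parse_openapi(spec)
```

A structured list like this is easy to render into plain text for the language model, rather than sending the raw JSON specification.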
The api_selector.py file serves as the entry point for the tool. It accepts command-line arguments to initialize the application with user-provided details, the path to the openapi.json file, and the API's base URL.
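A minimal argument-parsing sketch for such an entry point, mirroring the flags from the run command above (the parser construction here is an assumption about the implementation, not a copy of it):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Flag names match the run command shown earlier; argparse maps
    # "--openapi-json" to the attribute "openapi_json".
    parser = argparse.ArgumentParser(prog="api_selector")
    parser.add_argument("--openapi-json", required=True,
                        help="Path to the openapi.json file to analyze")
    parser.add_argument("--base-url", required=True,
                        help="Base URL of the website the API lives on")
    parser.add_argument("--gf_api_key", required=True,
                        help="Your GoFreight API key")
    return parser

# Example invocation with placeholder values.
args = build_parser().parse_args([
    "--openapi-json", "openapi.json",
    "--base-url", "https://example.com",
    "--gf_api_key", "demo-key",
])
```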
The main.py file contains the ApiSelector class, which is responsible for API selection based on the given openapi.json file. It also oversees the initiation of the agent that handles the interaction between the user and the tool.
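To make the selection idea concrete, here is a deliberately naive sketch: a class holding parsed endpoint summaries and matching them against a user query by keyword. The real ApiSelector presumably delegates this matching to the language model; the class shape below is hypothetical.

```python
class ApiSelector:
    """Holds endpoint summaries and picks candidates for a user query.

    Keyword matching stands in for the model-driven selection the
    actual tool performs.
    """

    def __init__(self, endpoints: list[dict]):
        self.endpoints = endpoints

    def select(self, query: str) -> list[dict]:
        words = query.lower().split()
        return [
            e for e in self.endpoints
            if any(w in e["summary"].lower() for w in words)
        ]
```

Usage: `ApiSelector(endpoints).select("how do I list shipments?")` would return the endpoints whose summaries mention any word from the question.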
Within engine.py, the ProcessingEngine class is defined. This class is tasked with the processing of user queries. It formulates the requests to the language model and interprets the responses to provide answers.
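A sketch of that request-formulation step, with the model call injected as a plain callable so the prompt logic can be exercised offline (the class interface is an assumption, not the tool's actual API):

```python
class ProcessingEngine:
    """Builds a prompt from the user query plus endpoint summaries,
    then hands it to an injected model client."""

    def __init__(self, model_client):
        # model_client: any callable str -> str, e.g. a wrapper
        # around the LLM backend. Injected for testability.
        self.model_client = model_client

    def build_prompt(self, query: str, endpoints: list[dict]) -> str:
        listing = "\n".join(
            f"{e['method']} {e['path']}: {e['summary']}" for e in endpoints
        )
        return f"Available endpoints:\n{listing}\n\nUser question: {query}"

    def answer(self, query: str, endpoints: list[dict]) -> str:
        return self.model_client(self.build_prompt(query, endpoints))
```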
The chat.py file manages the chat interface. It is in charge of storing the conversation history and sending requests to the AWS Bedrock API.
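The history-keeping part of such an interface can be sketched as follows. The backend call is abstracted behind `send_fn` so the example runs without AWS credentials; in the real tool that callable would wrap the AWS Bedrock runtime client (class and method names here are illustrative).

```python
class ChatSession:
    """Keeps the running conversation and delegates each turn to a backend.

    send_fn takes the full message history (a list of role/content
    dicts) and returns the assistant's reply as a string; in practice
    it would call AWS Bedrock.
    """

    def __init__(self, send_fn):
        self.send_fn = send_fn
        self.history: list[dict] = []

    def ask(self, user_message: str) -> str:
        self.history.append({"role": "user", "content": user_message})
        reply = self.send_fn(self.history)
        self.history.append({"role": "assistant", "content": reply})
        return reply
```

Because every turn appends both the user message and the reply, the backend always receives the full conversation context.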
agent.py acts as the orchestrator for all the separate components. It is responsible for loading the system and user prompts and starting the chat and processing engines.
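The orchestration described above might reduce to something like this sketch, where the system prompt is prepended before each query is handed to the processing side (the `Agent` class and its `run_once` method are hypothetical names for illustration):

```python
class Agent:
    """Wires a loaded system prompt to a query-handling callable."""

    def __init__(self, system_prompt: str, engine):
        # engine: any callable str -> str standing in for the
        # chat/processing pipeline the real agent starts up.
        self.system_prompt = system_prompt
        self.engine = engine

    def run_once(self, user_query: str) -> str:
        return self.engine(f"{self.system_prompt}\n\n{user_query}")
```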
