The ChatGPT API uses usage-based pricing: charges are determined by the number of tokens processed, counting both input tokens (the text you send) and output tokens (the text the model returns). The total token count of a conversation, across everything exchanged between the user and the model, therefore determines the final cost. The pricing structure spans several models, each with its own capabilities and rates. Costs are computed per 1,000 tokens, which corresponds to roughly 750 words of English text, giving users flexibility and transparency; refer to OpenAI's pricing page for current rates.
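
To make the per-1,000-token arithmetic concrete, here is a minimal Python sketch of estimating the cost of a single API call. The model name and per-1,000-token rates below are hypothetical placeholders, not actual OpenAI prices; substitute the current rates for your chosen model from the pricing page.

```python
# Hypothetical rates: (input cost, output cost) per 1,000 tokens, in USD.
# These are illustrative placeholders, not real OpenAI prices.
HYPOTHETICAL_RATES = {
    "example-model": (0.0010, 0.0020),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the charge in USD for one API call from its token counts."""
    input_rate, output_rate = HYPOTHETICAL_RATES[model]
    return (input_tokens / 1000) * input_rate + (output_tokens / 1000) * output_rate

# Example: a 1,200-token prompt that produces an 800-token reply.
print(f"${estimate_cost('example-model', 1200, 800):.4f}")  # -> $0.0028
```

Because both directions are billed, long conversations grow in cost faster than the length of the latest message alone: each new turn typically resends prior context as input tokens.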