feat: model configuration to support different models (#183)
Co-authored-by: hiroki osame <hiroki.osame@gmail.com>
SixiS and privatenumber committed Apr 1, 2023
1 parent ebe83a4 commit eee3bbf
Showing 5 changed files with 18 additions and 3 deletions.
README.md (6 additions, 0 deletions)

````diff
@@ -180,6 +180,12 @@ To clear the proxy option, you can use the command (note the empty value after t
 aicommits config set proxy=
 ```
 
+#### model
+
+Default: `gpt-3.5-turbo`
+
+The Chat Completions (`/v1/chat/completions`) model to use. Consult the list of models available in the [OpenAI Documentation](https://platform.openai.com/docs/models/model-endpoint-compatibility).
+
 ## How it works
 
 This CLI tool runs `git diff` to grab all your latest code changes, sends them to OpenAI's GPT-3, then returns the AI generated commit message.
````
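With this change, the model is configured the same way as the other options documented above. A hypothetical usage example (`gpt-4` is only a placeholder for any Chat Completions model name, and clearing the value falls back to the default per the new parser):

```sh
# Switch to a different Chat Completions model (example value)
aicommits config set model=gpt-4

# Clear the value to fall back to the default (gpt-3.5-turbo)
aicommits config set model=
```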
src/commands/aicommits.ts (1 addition, 0 deletions)

```diff
@@ -54,6 +54,7 @@ export default async (
 	try {
 		messages = await generateCommitMessage(
 			config.OPENAI_KEY,
+			config.model,
 			config.locale,
 			staged.diff,
 			config.generate,
```
src/commands/prepare-commit-msg-hook.ts (1 addition, 0 deletions)

```diff
@@ -41,6 +41,7 @@ export default () => (async () => {
 	try {
 		messages = await generateCommitMessage(
 			config.OPENAI_KEY,
+			config.model,
 			config.locale,
 			staged!.diff,
 			config.generate,
```

GitHub Actions annotation on line 46 of the new file, reported by both Test (ubuntu-latest) and Test (windows-latest): "Forbidden non-null assertion".
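The annotation refers to the `staged!.diff` argument. For illustration only, and not part of this commit, the usual way to satisfy that lint rule is an explicit guard so the non-null assertion becomes unnecessary:

```ts
// Illustrative only: narrowing `staged` with a guard removes the need for `staged!`.
if (!staged) {
	return; // hypothetical early exit; the real hook's handling may differ
}
const { diff } = staged; // safely typed as non-null from here on
```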
src/utils/config.ts (8 additions, 0 deletions)

```diff
@@ -2,6 +2,7 @@ import fs from 'fs/promises';
 import path from 'path';
 import os from 'os';
 import ini from 'ini';
+import type { TiktokenModel } from '@dqbd/tiktoken';
 import { fileExists } from './fs.js';
 import { KnownError } from './error.js';
 
@@ -60,6 +61,13 @@ const configParsers = {
 
 		return url;
 	},
+	model(model?: string) {
+		if (!model || model.length === 0) {
+			return 'gpt-3.5-turbo';
+		}
+
+		return model as TiktokenModel;
+	},
 } as const;
 
 type ConfigKeys = keyof typeof configParsers;
```
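To make the parser's behaviour concrete, here is a small standalone sketch of the same logic (the `parseModel` name and the `TiktokenModel` stand-in are illustrative; the real code is the `model` entry of `configParsers` above). Note that any non-empty string is passed through with only a type cast, so no validation against the model list happens here:

```ts
// Standalone sketch of the new config parser's fallback behaviour (illustrative names).
type TiktokenModel = string; // stand-in for the string-union type from @dqbd/tiktoken

const parseModel = (model?: string): TiktokenModel =>
	(!model || model.length === 0) ? 'gpt-3.5-turbo' : (model as TiktokenModel);

console.log(parseModel());          // 'gpt-3.5-turbo' — unset falls back to the default
console.log(parseModel(''));        // 'gpt-3.5-turbo' — empty value also falls back
console.log(parseModel('gpt-4'));   // 'gpt-4' — any non-empty value is passed through unchecked
```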
src/utils/openai.ts (2 additions, 3 deletions)

```diff
@@ -1,7 +1,7 @@
 import https from 'https';
 import type { ClientRequest, IncomingMessage } from 'http';
 import type { CreateChatCompletionRequest, CreateChatCompletionResponse } from 'openai';
-import { encoding_for_model as encodingForModel } from '@dqbd/tiktoken';
+import { type TiktokenModel, encoding_for_model as encodingForModel } from '@dqbd/tiktoken';
 import createHttpsProxyAgent from 'https-proxy-agent';
 import { KnownError } from './error.js';
 
@@ -99,10 +99,9 @@ const deduplicateMessages = (array: string[]) => Array.from(new Set(array));
 
 const getPrompt = (locale: string, diff: string) => `Write an insightful but concise Git commit message in a complete sentence in present tense for the following diff without prefacing it with anything, the response must be in the language ${locale}:\n${diff}`;
 
-const model = 'gpt-3.5-turbo';
-
 export const generateCommitMessage = async (
 	apiKey: string,
+	model: TiktokenModel,
 	locale: string,
 	diff: string,
 	completions: number,
```
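For context, a minimal sketch of how a `model` parameter like this is typically threaded through: it serves both as the tokenizer name for `@dqbd/tiktoken` and as the `model` field of the Chat Completions request body. The helper names (`buildRequest`, `countPromptTokens`) are hypothetical and this is not the actual body of `generateCommitMessage`:

```ts
import { type TiktokenModel, encoding_for_model as encodingForModel } from '@dqbd/tiktoken';
import type { CreateChatCompletionRequest } from 'openai';

// Hypothetical helper: the configured model becomes the `model` field of the request body.
const buildRequest = (
	model: TiktokenModel,
	prompt: string,
	completions: number,
): CreateChatCompletionRequest => ({
	model,
	messages: [{ role: 'user', content: prompt }],
	n: completions,
});

// Hypothetical helper: the same model name selects the matching tokenizer for token counting.
const countPromptTokens = (model: TiktokenModel, prompt: string): number => {
	const encoder = encodingForModel(model);
	const tokenCount = encoder.encode(prompt).length;
	encoder.free(); // the encoder is WASM-backed and must be released explicitly
	return tokenCount;
};
```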
