📚 Text translation using langchain.js and Gemini in a NestJS application


💡 News category: Programming
🔗 Source: dev.to

Introduction

I am a language learner who studies Mandarin and Spanish in my spare time. When I discovered that text translation with langchain.js and Gemini 1.0 Pro was possible, I wanted to bring the strengths of generative AI to my hobby. Therefore, I built a NestJS application that translates text between two languages through LangChain and the Gemini 1.0 Pro model.

What is langchain.js?

LangChain is a framework for developing applications powered by language models. It offers libraries for creating prompts, text embeddings, retrievers, and chat models, and it integrates with third-party providers such as Google.

Generate Gemini API Key

Go to https://aistudio.google.com/app/apikey to generate an API key for a new or an existing Google Cloud project.

Create a new NestJS Project

nest new nestjs-genai-translation

Install dependencies

npm i --save-exact  zod @nestjs/swagger @nestjs/throttler dotenv compression helmet langchain @langchain/google-genai

Generate a Translation Module

nest g mo translation
nest g co translation/http/translator --flat
nest g s translation/application/langchainTranslator --flat

These commands create a Translation module, a controller, and a service for the API.

Define Gemini environment variables

// .env.example

PORT=3000
GOOGLE_GEMINI_API_KEY=<google gemini api key>
GOOGLE_GEMINI_MODEL=gemini-pro
AI_SERVICE=langchain_googleChatModel

Copy .env.example to .env, and replace GOOGLE_GEMINI_API_KEY and GOOGLE_GEMINI_MODEL with the actual API Key and the Gemini model name, respectively.

  • GOOGLE_GEMINI_API_KEY - the Gemini API key
  • GOOGLE_GEMINI_MODEL - the Gemini model name. In this application, I use Gemini 1.0 Pro to perform text translations
  • AI_SERVICE - Generative AI service to be used in the application

Add .env to the .gitignore file to prevent accidentally committing the Gemini API Key to the GitHub repo.

Add configuration files

The project has 3 configuration files. validate.config.ts validates that the payload is valid before any request is routed to the controller.

// validate.config.ts

import { ValidationPipe } from '@nestjs/common';

export const validateConfig = new ValidationPipe({
  whitelist: true,
  stopAtFirstError: true,
  forbidUnknownValues: false,
});

env.config.ts extracts the environment variables from process.env and stores the values in the env object.

// env.config.ts

import dotenv from 'dotenv';
import { Integration } from '~core/types/integration.type';

dotenv.config();

export const env = {
  PORT: parseInt(process.env.PORT || '3000'),
  GEMINI: {
    API_KEY: process.env.GOOGLE_GEMINI_API_KEY || '',
    MODEL_NAME: process.env.GOOGLE_GEMINI_MODEL || 'gemini-pro',
  },
  AI_SERVICE: (process.env.AI_SERVICE || 'langchain_googleChatModel') as Integration,
};

throttler.config.ts defines the rate limit of the Translation API.

// throttler.config.ts

import { ThrottlerModule } from '@nestjs/throttler';

export const throttlerConfig = ThrottlerModule.forRoot([
  {
    ttl: 60000,
    limit: 10,
  },
]);

Each route allows ten requests in 60,000 milliseconds or 1 minute.
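To make the throttling behavior concrete, here is a dependency-free sketch of a fixed-window rate limiter in the same spirit as the configuration above (ten requests per 60-second window). The class and method names are hypothetical; @nestjs/throttler's internal implementation differs.

```typescript
// Illustrative fixed-window rate limiter; not @nestjs/throttler internals.
class FixedWindowLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(
    private readonly ttlMs: number,
    private readonly limit: number,
  ) {}

  // Returns true when the request is allowed, false when throttled.
  allow(clientId: string, now: number): boolean {
    const entry = this.hits.get(clientId);
    if (!entry || now - entry.windowStart >= this.ttlMs) {
      // First request, or the previous window expired: start a new window.
      this.hits.set(clientId, { count: 1, windowStart: now });
      return true;
    }
    if (entry.count >= this.limit) {
      return false;
    }
    entry.count += 1;
    return true;
  }
}
```

With ttl 60000 and limit 10, the eleventh request in a window is rejected, and counting restarts once the window elapses.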

Bootstrap the application

// bootstrap.ts

export class Bootstrap {
  private app: NestExpressApplication;

  async initApp() {
    this.app = await NestFactory.create(AppModule);
  }

  enableCors() {
    this.app.enableCors();
  }

  setupMiddleware() {
    this.app.use(express.json({ limit: '1000kb' }));
    this.app.use(express.urlencoded({ extended: false }));
    this.app.use(compression());
    this.app.use(helmet());
  }

  setupGlobalPipe() {
    this.app.useGlobalPipes(validateConfig);
  }

  async startApp() {
    await this.app.listen(env.PORT);
  }

  setupSwagger() {
    const config = new DocumentBuilder()
      .setTitle('Generative AI Translator')
      .setDescription('Integrate with Generative AI to translate a text from one language to another language')
      .setVersion('1.0')
      .addTag('Azure OpenAI, Langchain Gemini AI Model, Google Translate Cloud API')
      .build();
    const document = SwaggerModule.createDocument(this.app, config);
    SwaggerModule.setup('api', this.app, document);
  }
}

Add a Bootstrap class to set up Swagger, middleware, the global validation pipe, CORS, and, finally, application start.

// main.ts

import { Bootstrap } from '~core/bootstrap';

async function bootstrap() {
  const bootstrap = new Bootstrap();
  await bootstrap.initApp();
  bootstrap.enableCors();
  bootstrap.setupMiddleware();
  bootstrap.setupGlobalPipe();
  bootstrap.setupSwagger();
  await bootstrap.startApp();
}
bootstrap()
  .then(() => console.log('The application starts successfully'))
  .catch((error) => console.error(error));

The bootstrap function enables CORS, registers middleware to the application, sets up Swagger documentation, and uses a global pipe to validate payloads.

I have laid the groundwork; the next step is to add routes that receive payloads to translate text between a source language and a target language.

Define Translation DTO

// languages_codes.validation.ts

import { z } from 'zod';

const LANGUAGE_CODES = {
  English: 'en',
  Spanish: 'es',
  'Simplified Chinese': 'zh-Hans',
  'Traditional Chinese': 'zh-Hant',
  Vietnamese: 'vi',
  Japanese: 'ja',
} as const;

export const ZOD_LANGUAGE_CODES = z.nativeEnum(LANGUAGE_CODES, {
  required_error: 'Language code is required',
  invalid_type_error: 'Language code is invalid',
});
export type LanguageCodesType = z.infer<typeof ZOD_LANGUAGE_CODES>;

// translate-text.dto.ts

import { z } from 'zod';
import { ZOD_LANGUAGE_CODES } from '~translation/application/validations/language_codes.validation';

export const translateTextSchema = z
  .object({
    text: z.string({
      required_error: 'Text is required',
    }),
    srcLanguageCode: ZOD_LANGUAGE_CODES,
    targetLanguageCode: ZOD_LANGUAGE_CODES,
  })
  .required();

export type TranslateTextDto = z.infer<typeof translateTextSchema>;

translateTextSchema accepts a text, a source language code, and a target language code. Then, I use zod to infer the type of translateTextSchema and assign it to TranslateTextDto.
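To illustrate what the schema enforces, here is a dependency-free sketch of the same checks in plain TypeScript; in the real application, zod performs this work through translateTextSchema.parse, and validateTranslateText here is only an illustrative stand-in.

```typescript
// The supported language codes, mirroring LANGUAGE_CODES in the validation file.
const LANGUAGE_CODES = ['en', 'es', 'zh-Hans', 'zh-Hant', 'vi', 'ja'] as const;
type LanguageCode = (typeof LANGUAGE_CODES)[number];

interface TranslateTextDto {
  text: string;
  srcLanguageCode: LanguageCode;
  targetLanguageCode: LanguageCode;
}

// Hand-rolled equivalent of translateTextSchema.parse (illustration only).
function validateTranslateText(payload: unknown): TranslateTextDto {
  const value = payload as Partial<TranslateTextDto>;
  if (typeof value?.text !== 'string') {
    throw new Error('Text is required');
  }
  if (!LANGUAGE_CODES.includes(value.srcLanguageCode as LanguageCode)) {
    throw new Error('Language code is invalid');
  }
  if (!LANGUAGE_CODES.includes(value.targetLanguageCode as LanguageCode)) {
    throw new Error('Language code is invalid');
  }
  return value as TranslateTextDto;
}
```

A payload such as `{ text: 'hello', srcLanguageCode: 'en', targetLanguageCode: 'es' }` passes, while a missing text or an unknown language code is rejected.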

Define Translator Interface

This application is designed to translate text using either Azure OpenAI, langchain.js with the Gemini Pro model, or the Google Translate Cloud API. Therefore, I created a Translator interface, and all services that implement it must fulfill the contract.

//  translator-input.interface.ts

import { LanguageCodesType } from '../validations/language_codes.validation';

export interface TranslateInput {
  text: string;
  srcLanguageCode: LanguageCodesType;
  targetLanguageCode: LanguageCodesType;
}

// translation-result.interface.ts

import { Integration } from '~core/types/integration.type';

export interface TranslationResult {
  text: string;
  aiService: Integration;
}

// translator.interface.ts

import { TranslationResult } from './translation-result.interface';
import { TranslateInput } from './translator-input.interface';

export interface Translator {
  translate(input: TranslateInput): Promise<TranslationResult>;
}

Implement Langchain Translator Service

// language_names.enum.ts

export enum LANGUAGE_NAMES {
  ENGLISH = 'English',
  JAPANESE = 'Japanese',
  SIMPLIFIED_CHINESE = 'Simplified Chinese',
  TRADITIONAL_CHINESE = 'Traditional Chinese',
  SPANISH = 'Spanish',
  VIETNAMESE = 'Vietnamese',
}

The LANGUAGE_NAMES enum maps the language codes to human-readable language names. LangChain's prompt template uses these names to build a formatted prompt that queries the Gemini 1.0 Pro model.
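As a quick illustration of what the prompt template produces, the following sketch interpolates the language names into the system message by hand; in the application this substitution is done by LangChain's SystemMessagePromptTemplate, and formatSystemMessage is only a hypothetical helper.

```typescript
// Hand-rolled stand-in for SystemMessagePromptTemplate.fromTemplate(...).format(...)
function formatSystemMessage(srcLanguageName: string, targetLanguageName: string): string {
  const template =
    'You are a helpful language translator that translates {srcLanguageName} to {targetLanguageName}';
  // Substitute the placeholders with the resolved language names.
  return template
    .replace('{srcLanguageName}', srcLanguageName)
    .replace('{targetLanguageName}', targetLanguageName);
}
```

For example, the codes 'en' and 'es' resolve to English and Spanish, yielding the system message "You are a helpful language translator that translates English to Spanish".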

// translator.constant.ts

export const GEMINI_CHAT_MODEL_LLM_CHAIN = 'GEMINI_CHAT_MODEL_LLM_CHAIN';

// translation-chain.provider.ts

// Omit the import statements for brevity

const chatModel = new ChatGoogleGenerativeAI({
  modelName: env.GEMINI.MODEL_NAME,
  maxOutputTokens: 128,
  safetySettings: [
    {
      category: HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT,
      threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
    },
    {
      category: HarmCategory.HARM_CATEGORY_HARASSMENT,
      threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
    },
    {
      category: HarmCategory.HARM_CATEGORY_HATE_SPEECH,
      threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
    },
    {
      category: HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT,
      threshold: HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
    },
  ],
  temperature: 0,
  topK: 3,
  topP: 0.5,
  apiKey: env.GEMINI.API_KEY,
});

export const GEMINI_LLM_CHAIN_PROVIDER: Provider = {
  provide: GEMINI_CHAT_MODEL_LLM_CHAIN,
  useFactory: () => {
    const systemMessageTemplate = SystemMessagePromptTemplate.fromTemplate(
      'You are a helpful language translator that translates {srcLanguageName} to {targetLanguageName}',
    );
    const humanMessageTemplate = HumanMessagePromptTemplate.fromTemplate('{text}');
    const chatPrompt = ChatPromptTemplate.fromMessages([systemMessageTemplate, humanMessageTemplate]);

    const outputParser = new StringOutputParser();
    return chatPrompt.pipe(chatModel).pipe(outputParser);
  },
};

GEMINI_LLM_CHAIN_PROVIDER creates a provider that pipes the formatted prompt to the Gemini 1.0 Pro chat model and parses the model output into a string.
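The pipe call chains runnables so that each step's output feeds the next. The following minimal sketch mimics that composition with a simplified Runnable interface and fake stages; it is an illustration of the pattern, not LangChain's actual implementation.

```typescript
// Simplified stand-in for LangChain's Runnable: each step transforms its
// input, and pipe composes steps left to right.
interface Runnable<In, Out> {
  invoke(input: In): Promise<Out>;
  pipe<Next>(next: Runnable<Out, Next>): Runnable<In, Next>;
}

function runnable<In, Out>(fn: (input: In) => Promise<Out>): Runnable<In, Out> {
  return {
    invoke: fn,
    pipe<Next>(next: Runnable<Out, Next>): Runnable<In, Next> {
      // Feed this runnable's output into the next one, like
      // chatPrompt.pipe(chatModel).pipe(outputParser).
      return runnable(async (input: In) => next.invoke(await fn(input)));
    },
  };
}

// Hypothetical stages: format the prompt, call a fake model, parse the output.
const prompt = runnable(async (text: string) => `Translate: ${text}`);
const fakeModel = runnable(async (p: string) => `[model saw "${p}"]`);
const parser = runnable(async (raw: string) => raw.trim());

const chain = prompt.pipe(fakeModel).pipe(parser);
```

Invoking the chain runs the three stages in order, just as the real provider runs prompt formatting, the Gemini chat model, and the StringOutputParser.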

// langchain-translator.service.ts

// Omit import statements for brevity

@Injectable()
export class LangchainTranslatorService implements Translator {
  readonly languageMapper = new Map<LanguageCodesType, LANGUAGE_NAMES>();

  constructor(@Inject(GEMINI_CHAT_MODEL_LLM_CHAIN) private readonly llmChain: Runnable<any, string>) {
    this.languageMapper.set('en', LANGUAGE_NAMES.ENGLISH);
    this.languageMapper.set('es', LANGUAGE_NAMES.SPANISH);
    this.languageMapper.set('ja', LANGUAGE_NAMES.JAPANESE);
    this.languageMapper.set('vi', LANGUAGE_NAMES.VIETNAMESE);
    this.languageMapper.set('zh-Hans', LANGUAGE_NAMES.SIMPLIFIED_CHINESE);
    this.languageMapper.set('zh-Hant', LANGUAGE_NAMES.TRADITIONAL_CHINESE);
  }

  async translate({ text, srcLanguageCode, targetLanguageCode }: TranslateInput): Promise<TranslationResult> {
    const srcLanguageName = this.languageMapper.get(srcLanguageCode);
    const targetLanguageName = this.languageMapper.get(targetLanguageCode);

    const translatedText = await this.llmChain.invoke({
      srcLanguageName,
      targetLanguageName,
      text: text,
    });

    return {
      text: translatedText,
      aiService: 'langchain_googleChatModel',
    };
  }
}

The translate method of LangchainTranslatorService uses the language codes to determine the source and target language names. Then, the language names and the text are passed to the LLM chain to obtain the translation. Finally, the method returns the translated text, which the controller sends back in the HTTP response.

Implement Translator Controller

// zod-validation.pipe.ts

export class ZodValidationPipe implements PipeTransform {
  constructor(private schema: ZodSchema) {}

  transform(value: unknown) {
    try {
      const parsedValue = this.schema.parse(value);
      return parsedValue;
    } catch (error) {
      console.error(error);
      if (error instanceof ZodError) {
        throw new BadRequestException(error.errors?.[0]?.message || 'Validation failed');
      } else if (error instanceof Error) {
        throw new BadRequestException(error.message);
      }
      throw error;
    }
  }
}

ZodValidationPipe is a pipe that validates the payload against the Zod schema. When the validation is successful, the payload is parsed and returned. When the validation fails, the pipe catches the ZodError and throws a BadRequestException with the first error message.

// translator.controller.ts

// Omit the import statements to save space

@ApiTags('Translator')
@Controller('translator')
export class TranslatorController {
  constructor(@Inject(TRANSLATOR) private translatorService: Translator) {}

  @ApiBody({
    description: 'An instance of TranslateTextDto',
    required: true,
    schema: {
      type: 'object',
      properties: {
        text: {
          type: 'string',
          description: 'text to be translated',
        },
        srcLanguageCode: {
          type: 'string',
          description: 'source language code',
          enum: ['en', 'es', 'zh-Hans', 'zh-Hant', 'vi', 'ja'],
        },
        targetLanguageCode: {
          type: 'string',
          description: 'target language code',
          enum: ['en', 'es', 'zh-Hans', 'zh-Hant', 'vi', 'ja'],
        },
      },
    },
    examples: {
      greeting: {
        value: {
          text: 'Good morning, good afternoon, good evening.',
          srcLanguageCode: 'en',
          targetLanguageCode: 'es',
        },
      },
    },
  })
  @ApiResponse({
    description: 'The translated text',
    schema: {
      type: 'object',
      properties: {
        text: { type: 'string', description: 'translated text' },
        aiService: { type: 'string', description: 'AI service' },
      },
    },
    status: 200,
  })
  @HttpCode(200)
  @Post()
  @UsePipes(new ZodValidationPipe(translateTextSchema))
  translate(@Body() dto: TranslateTextDto): Promise<TranslationResult> {
    return this.translatorService.translate(dto);
  }
}

The TranslatorController injects a Translator, which in this configuration is an instance of LangchainTranslatorService. The endpoint invokes the translate method to perform text translation using langchain.js and the Gemini 1.0 Pro model.

Dynamic registration

This application registers the translation service based on the AI_SERVICE environment variable. The value of the environment variable is one of azureOpenAI, langchain_googleChatModel, or google_translate.

// .env.example

AI_SERVICE=langchain_googleChatModel

// integration.type.ts

export type Integration = 'azureOpenAI' | 'langchain_googleChatModel' | 'google_translate';

// translator.module.ts

// Omit import statements for brevity

function createProviders(serviceType: Integration) {
  const serviceMap = new Map<Integration, any>();
  serviceMap.set('azureOpenAI', AzureTranslatorService);
  serviceMap.set('langchain_googleChatModel', LangchainTranslatorService);
  const translatorService = serviceMap.get(serviceType);

  const providers: Provider[] = [
    {
      provide: TRANSLATOR,
      useClass: translatorService,
    },
  ];

  if (serviceType === 'langchain_googleChatModel') {
    providers.push(GEMINI_LLM_CHAIN_PROVIDER);
  }
  return providers;
}

@Module({
  imports: [HttpModule],
  controllers: [TranslatorController],
})
export class TranslationModule {
  static register(type: Integration = 'azureOpenAI'): DynamicModule {
    const logger = new Logger(TranslationModule.name);
    const isProduction = env.APP_ENV === APP_ENV_NAMES.PRODUCTION;
    // google_translate only works in the local environment. Default to azureOpenAI in production
    const serviceType = isProduction && type === 'google_translate' ? 'azureOpenAI' : type;

    logger.log(`isProduction? ${isProduction}`);
    logger.log(`serviceType? ${serviceType}`);

    return {
      module: TranslationModule,
      providers: createProviders(serviceType),
    };
  }
}

In TranslationModule, I define a register method that returns a DynamicModule. When type is langchain_googleChatModel, the TRANSLATOR token provides LangchainTranslatorService. Next, TranslationModule.register(env.AI_SERVICE) creates a TranslationModule that I import in the AppModule.
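The selection logic boils down to a lookup from the Integration value to a service class. Here is a dependency-free sketch of that lookup; the empty classes stand in for the real NestJS services, and the fallback to AzureTranslatorService is an assumption for illustration.

```typescript
type Integration = 'azureOpenAI' | 'langchain_googleChatModel' | 'google_translate';

// Placeholder classes standing in for the real translator services.
class AzureTranslatorService {}
class LangchainTranslatorService {}

// Pick the service class for the configured integration, mirroring createProviders.
function resolveTranslator(serviceType: Integration): new () => object {
  const serviceMap = new Map<Integration, new () => object>();
  serviceMap.set('azureOpenAI', AzureTranslatorService);
  serviceMap.set('langchain_googleChatModel', LangchainTranslatorService);
  // Assumed fallback when no class is registered for the requested type.
  return serviceMap.get(serviceType) ?? AzureTranslatorService;
}
```

With AI_SERVICE=langchain_googleChatModel, the lookup resolves to LangchainTranslatorService, which the TRANSLATOR token then provides.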

// app.module.ts

@Module({
  imports: [throttlerConfig, TranslationModule.register(env.AI_SERVICE)],
  controllers: [AppController],
  providers: [
    AppService,
    {
      provide: APP_GUARD,
      useClass: ThrottlerGuard,
    },
  ],
})
export class AppModule {}

Test the endpoints

I can test the endpoints with cURL, Postman, or the Swagger documentation after launching the application.

npm run start:dev

The URL of the Swagger documentation is http://localhost:3000/api.

In cURL

curl --location 'http://localhost:3000/translator' \
--header 'Content-Type: application/json' \
--data '{
    "text": "My name is John\n\nI am a Chinese",
    "srcLanguageCode": "en",
    "targetLanguageCode": "es"
}'


Dockerize the application

// .dockerignore

.git
.gitignore
node_modules/
dist/
Dockerfile
.dockerignore
npm-debug.log

Create a .dockerignore file for Docker to ignore some files and directories.

// Dockerfile

# Use an official Node.js runtime as the base image
FROM node:20-alpine

# Set the working directory in the container
WORKDIR /app

# Copy the package manifests and TypeScript config first to leverage layer caching
COPY package*.json tsconfig.json ./

# Install the dependencies
RUN npm install

# Copy the rest of the application code to the working directory
COPY . .

# Build the NestJS application
RUN npm run build

# Expose the port the application listens on
EXPOSE 3000

# Define the command to run the application
CMD [ "npm", "start" ]

The Dockerfile installs the dependencies, builds the NestJS application, and starts it on port 3000.

//  .env.docker.example

PORT=3000
APP_ENV=<application environment>
AZURE_OPENAI_TRANSLATOR_API_KEY=<translator api key>
AZURE_OPENAI_TRANSLATOR_URL=<translator url>/translate
AZURE_OPENAI_TRANSLATOR_API_VERSION="3.0"
AZURE_OPENAI_LOCATION=eastasia
GOOGLE_GEMINI_API_KEY=<google gemini api key>
GOOGLE_GEMINI_MODEL=gemini-pro
AI_SERVICE=langchain_googleChatModel

.env.docker.example stores the relevant environment variables that I copied from the NestJS application.

// docker-compose.yaml

version: '3.8'

services:
  backend:
    build:
      context: ./nestjs-genai-translation
      dockerfile: Dockerfile
    environment:
      - PORT=${PORT}
      - APP_ENV=${APP_ENV}
      - AZURE_OPENAI_TRANSLATOR_API_KEY=${AZURE_OPENAI_TRANSLATOR_API_KEY}
      - AZURE_OPENAI_TRANSLATOR_URL=${AZURE_OPENAI_TRANSLATOR_URL}
      - AZURE_OPENAI_TRANSLATOR_API_VERSION=${AZURE_OPENAI_TRANSLATOR_API_VERSION}
      - AZURE_OPENAI_LOCATION=${AZURE_OPENAI_LOCATION}
      - GOOGLE_GEMINI_API_KEY=${GOOGLE_GEMINI_API_KEY}
      - GOOGLE_GEMINI_MODEL=${GOOGLE_GEMINI_MODEL}
      - AI_SERVICE=${AI_SERVICE}
    ports:
      - "${PORT}:${PORT}"
    networks:
      - ai
    restart: always

networks:
  ai:

I added docker-compose.yaml to the root folder; it is responsible for creating the NestJS application container.

This concludes my blog post about using langchain.js and the Gemini 1.0 Pro model to solve a real-world problem. I only scratched the surface of LangChain, which supports many integrations to solve problems in different domains. I hope you like the content and continue to follow my learning journey in Angular, NestJS, and other technologies.

Resources:

...


