The project's goal is to apply different Deep Learning techniques (a T5 Transformer, an encoder-decoder with BiLSTM models, and NLP methods) to generate brief, coherent summaries of news stories.
Repository containing the project for the course on Business and Project Management at the University of Pisa (A.Y. 2022/2023) realized by Fabiano Pilia, Emanuele Tinghi and Matteo Dal Zotto.
Fine-tuned FLAN-T5 to translate English to Hawaiian Pidgin.
A news headline generator fine-tuned from T5-base.
This project summarizes large text from any article to a smaller version without any loss in context. It uses the T5 Base transformer model.
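As an illustration of what such a T5-based summarizer typically looks like, here is a minimal sketch using Hugging Face Transformers; the checkpoint name, prefix, and generation settings are assumptions, not this project's exact configuration.

```python
# Minimal sketch: abstractive summarization with the t5-base checkpoint.
# Generation settings are illustrative, not taken from the repository.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

article = "Long news article text goes here..."
# T5 expects a task prefix; "summarize:" is the standard one for summarization.
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   max_length=512, truncation=True)
summary_ids = model.generate(**inputs, max_length=150, min_length=40,
                             num_beams=4, length_penalty=2.0)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```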
Fine-tuning the T5-base model on parts of the CNN/DailyMail dataset with PyTorch Lightning.
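A hedged sketch of this kind of setup, assuming a LightningModule wrapped around t5-base and a pre-tokenized CNN/DailyMail dataloader (the loader name below is hypothetical); the repository's real hyperparameters and data pipeline may differ.

```python
# Sketch of fine-tuning t5-base with PyTorch Lightning; batches are assumed to
# contain tokenized "summarize: <article>" inputs and highlight labels.
import pytorch_lightning as pl
import torch
from transformers import T5ForConditionalGeneration

class T5Summarizer(pl.LightningModule):
    def __init__(self, lr=3e-4):
        super().__init__()
        self.model = T5ForConditionalGeneration.from_pretrained("t5-base")
        self.lr = lr

    def training_step(self, batch, batch_idx):
        # The model computes the cross-entropy loss when labels are provided.
        out = self.model(input_ids=batch["input_ids"],
                         attention_mask=batch["attention_mask"],
                         labels=batch["labels"])
        self.log("train_loss", out.loss)
        return out.loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

# trainer = pl.Trainer(max_epochs=1)
# trainer.fit(T5Summarizer(), train_dataloaders=cnn_dailymail_loader)  # hypothetical loader
```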
Fine-Tuning LLM for summarization in Portuguese (T5)
An NLP-based summarizer that can summarize news articles.
This repository contains the implementation of the paper "Automatic Ellipsis Reconstruction in Coordinated German Sentences based on Text-To-Text Transfer Transformers", accepted at the 27th International Conference on Text, Speech and Dialogue.
German-to-English fine-tuned T5 base model with dependency-parsing enhancement.
A django application that allows you to summarize the content of a press article found by a user. It uses the T5 model for summarization, the LDA algorithm for topic modeling and Selenium to scrape the content of the linked article.
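For the topic-modeling step described above, a sketch like the following could apply LDA to scraped article text; scikit-learn is used here purely as an assumption, since the entry does not name the LDA library.

```python
# Illustrative only: LDA topic modeling over article text with scikit-learn,
# standing in for the repository's topic-modeling step.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = ["article text one ...", "article text two ...", "article text three ..."]
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
dtm = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(dtm)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top)}")
```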
A RAG (Retrieval-Augmented Generation) based FAQ chatbot, designed to operate within an organization's internal domain. - Jul 2023 - Oct 2023
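A minimal RAG sketch, assuming sentence-transformers for retrieval and a FLAN-T5 pipeline for answer generation; the repository's actual models, vector store, and prompt format are not specified in this listing.

```python
# Retrieve the most relevant FAQ passage, then generate an answer from it.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

faq_passages = [
    "Employees may request remote work through the HR portal.",
    "The IT helpdesk is reachable at extension 1234 during business hours.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
passage_emb = embedder.encode(faq_passages, convert_to_tensor=True)
generator = pipeline("text2text-generation", model="google/flan-t5-base")

def answer(question: str) -> str:
    # Retrieve: pick the FAQ passage most similar to the question.
    q_emb = embedder.encode(question, convert_to_tensor=True)
    best = util.cos_sim(q_emb, passage_emb).argmax().item()
    # Generate: condition the seq2seq model on the retrieved context.
    prompt = (f"Answer the question using the context.\n"
              f"context: {faq_passages[best]}\nquestion: {question}")
    return generator(prompt, max_length=64)[0]["generated_text"]

print(answer("How do I contact the IT helpdesk?"))
```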
This repository hosts an AI-powered chatbot using LLaMA 2 to promote constitutional literacy. It provides interactive answers on citizen rights, fundamental duties, and legal provisions. Built with Hugging Face Transformers, Streamlit, and a user-friendly UI, it simplifies legal knowledge for all. Contributions and improvements are welcome! 🚀
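A sketch of how such a Streamlit chat front end might wrap a Hugging Face text-generation pipeline; the model ID, caching, and prompt handling shown here are illustrative assumptions rather than the repository's actual implementation.

```python
# Minimal Streamlit chat loop around a causal LM pipeline.
import streamlit as st
from transformers import pipeline

@st.cache_resource
def load_generator():
    # meta-llama/Llama-2-7b-chat-hf is gated; substitute any causal LM you have access to.
    return pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")

st.title("Constitutional literacy chatbot")
if "history" not in st.session_state:
    st.session_state.history = []

# Replay the conversation so far.
for role, text in st.session_state.history:
    st.chat_message(role).write(text)

if question := st.chat_input("Ask about your rights and duties"):
    st.chat_message("user").write(question)
    reply = load_generator()(question, max_new_tokens=128)[0]["generated_text"]
    st.chat_message("assistant").write(reply)
    st.session_state.history += [("user", question), ("assistant", reply)]
```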
An agent-based reader view plugin for summarizing an article or web page, using Fetch.ai's uAgents framework and the t5-base integration.
Fine-tuning the t5-base model for detoxifying texts.
Institute Technical Summer Project -23/24
This repository contains the code and outputs for the CS505: Natural Language Processing course project. The objective of this work is to explore the performance of different machine learning models in generating commit messages from changes in code.
Thesis scope: train and develop a Table-to-Text Transformer-based model for contextual summarization of tabular data. To achieve this, T5-small, T5-base, Bart-base and Llama2 7B chat were fine-tuned on ToTTo and QTSumm. On ToTTo, the models outperformed the benchmark.
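A hypothetical sketch of the kind of table linearization such table-to-text models consume; the tag set below loosely follows ToTTo's format, but the thesis's actual preprocessing may differ.

```python
# Flatten a table into a single tagged string that a T5-style model can read.
def linearize_table(title, header, rows):
    cells = " ".join(
        f"<cell> {value} <col_header> {col} </col_header> </cell>"
        for row in rows for col, value in zip(header, row)
    )
    return f"<page_title> {title} </page_title> <table> {cells} </table>"

table = linearize_table(
    "2020 Summer Olympics medal table",
    ["Rank", "Nation", "Gold"],
    [["1", "United States", "39"]],
)
# The linearized string is then passed to the fine-tuned seq2seq model,
# typically with a task prefix such as "summarize: ".
print(table)
```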
The GitHub repository Information_Summarizer by Prathamesh Patil is a versatile tool built using Django, React, and Tailwind CSS. It allows users to generate summaries from various sources such as plain text, PDFs, DOCX files, video transcripts, and website URLs, using a T5-base model trained on the abisee/cnn_dailymail 3.0 dataset.