Document Tools

🔧 Tools to automate your document understanding tasks.

This package contains tools to automate your document understanding tasks by leveraging the power of 🤗 Datasets and 🤗 Transformers.

With this package, you can (or, for the items marked 🚧, will soon be able to):

  • 🚧 Create a dataset from a collection of documents.
  • Transform a dataset to a format that is suitable for training a model.
  • 🚧 Train a model on a dataset.
  • 🚧 Evaluate the performance of a model on a dataset of documents.
  • 🚧 Export a model to a format that is suitable for inference.

Features

This project is under active development and is currently in the alpha stage. It is not ready for production use. If you find any bugs or have suggestions, please let us know by opening an issue or a pull request.

Usage

One-liner to get started:

from datasets import load_dataset
from document_tools import tokenize_dataset

# Load a dataset from 🤗 Hub
dataset = load_dataset("deeptools-ai/test-document-invoice", split="train")

# Tokenize the dataset
tokenized_dataset = tokenize_dataset(dataset, target_model="layoutlmv3")

For more information, please see the documentation.

Credits

This package was created with Cookiecutter and the waynerv/cookiecutter-pypackage project template.