Decoder-only-Transformer-Pre-training-

AyaanZ30 · PyTorch

Overview

Name
Decoder-only-Transformer-Pre-training-
Author
AyaanZ30
Framework
PyTorch
License
unknown
Skill type
other
Evidence level
untested
Task description
This repository contains a custom implementation of a decoder-only transformer neural network, pre-trained from scratch on a corpus of Shakespearean text, including monologues and dialogues. Unlike large language models (LLMs) that are often fine-tuned and further optimized (e.g. with PPO, as for GPT), this model relies on pre-training alone.
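The description above can be sketched in PyTorch. This is a minimal illustration of decoder-only pre-training with a causal mask and a shifted next-token cross-entropy loss, not the repository's actual code; all module names, sizes, and the character-level vocabulary of 65 tokens are assumptions for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DecoderOnlyTransformer(nn.Module):
    """Minimal decoder-only transformer for next-token prediction (illustrative)."""

    def __init__(self, vocab_size, d_model=64, n_heads=4, n_layers=2, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        # An encoder stack plus a causal mask behaves as a decoder-only stack
        # (no cross-attention, self-attention restricted to earlier positions).
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model,
            batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        B, T = idx.shape
        pos = torch.arange(T, device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        # Causal mask: position t may only attend to positions <= t.
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        x = self.blocks(x, mask=mask)
        return self.lm_head(x)


# One toy pre-training step; random tokens stand in for the Shakespeare corpus.
vocab_size = 65  # assumed character-level vocabulary size
model = DecoderOnlyTransformer(vocab_size)
tokens = torch.randint(0, vocab_size, (4, 33))        # (batch, seq_len + 1)
inputs, targets = tokens[:, :-1], tokens[:, 1:]       # shift by one token
logits = model(inputs)                                # (4, 32, vocab_size)
loss = F.cross_entropy(logits.reshape(-1, vocab_size), targets.reshape(-1))
loss.backward()
```

Pre-training here is simply minimizing this next-token loss over the corpus; no reward model or PPO stage follows.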

Spaces

Action space
other · 0-dim · 0Hz
Observation space
  • type: other

Links

HuggingFace repo
none
Paper (arXiv)
none

Compatible robots

20

Compatible environments

0

No environments list Decoder-only-Transformer-Pre-training- yet.

Datasets that reference this policy

0

No datasets reference Decoder-only-Transformer-Pre-training- yet.