policy
Decoder-only-Transformer-Pre-training-
AyaanZ30 · PyTorch
Overview
Name
Decoder-only-Transformer-Pre-training-
Author
AyaanZ30
Framework
PyTorch
License
unknown
Skill type
other
Evidence level
untested
Task description
This repository contains a custom implementation of a decoder-only transformer neural network, pre-trained from scratch on a corpus of Shakespearean text, including monologues and dialogues. Unlike large language models (LLMs) that are often fine-tuned and further optimized (e.g., with PPO for GPT), this m…
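The description above can be sketched in code. The following is a minimal, hedged illustration of a decoder-only transformer language model in PyTorch, of the kind the repository describes; the dimensions, layer counts, and vocabulary size (65, a typical character-level Shakespeare vocabulary) are assumptions for illustration, not the repository's actual hyperparameters.

```python
import torch
import torch.nn as nn


class DecoderBlock(nn.Module):
    """One pre-norm decoder block: causal self-attention followed by an MLP."""

    def __init__(self, d_model=128, n_heads=4):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )

    def forward(self, x):
        T = x.size(1)
        # Boolean causal mask: True entries are positions attention may NOT look at.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
        x = x + a                      # residual connection around attention
        x = x + self.mlp(self.ln2(x))  # residual connection around MLP
        return x


class TinyDecoderLM(nn.Module):
    """Token + position embeddings, a stack of decoder blocks, and an LM head."""

    def __init__(self, vocab_size=65, d_model=128, n_heads=4, n_layers=2, max_len=256):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        self.blocks = nn.ModuleList(
            DecoderBlock(d_model, n_heads) for _ in range(n_layers)
        )
        self.ln_f = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, idx):
        T = idx.size(1)
        x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
        for blk in self.blocks:
            x = blk(x)
        return self.head(self.ln_f(x))  # next-token logits, shape (B, T, vocab)


# Pre-training objective: predict the next character from all previous ones.
model = TinyDecoderLM()
tokens = torch.randint(0, 65, (2, 32))   # stand-in for encoded Shakespeare text
logits = model(tokens)                   # (2, 32, 65)
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 65), tokens[:, 1:].reshape(-1)
)
```

Training would simply repeat the loss computation above over batches of the corpus with an optimizer step; no reward-model or PPO stage is involved, which is the contrast the description draws with fine-tuned LLMs.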
Spaces
Action space
other · 0-dim · 0Hz
Observation space
- type: other
Links
HuggingFace repo
null
Paper (arXiv)
null
Compatible robots
20 entries · each not in seed:
- anybotics-anymal-c
- aloha
- google-barkour-vb
- boston-dynamics-spot
- franka-fr3
- google-barkour-v0
- agilex-piper
- berkeley-humanoid
- bitcraze-crazyflie-2
- anybotics-anymal-b
- agility-cassie
- arx-l5
- booster-t1
- franka-emika-panda
- franka-fr3-v2
- dynamixel-2r
- flexiv-rizon4
- assets
- apptronik-apollo
- fourier-n1
Compatible environments
No environments list Decoder-only-Transformer-Pre-training- yet.
Datasets that reference this policy
No datasets reference Decoder-only-Transformer-Pre-training- yet.