The use of LLMs in daily life is increasing, yet English remains the dominant language of nearly all base models. In addition, today's state-of-the-art LLMs are built on the transformer architecture, which has become the industry standard but is computationally inefficient because of its self-attention mechanism. We are proposing Cauvery 7b, a 7-billion-parameter large language model currently under development that does not use the transformer architecture and instead uses the retentive network (RetNet) architecture with its retention mechanism. We are in our early stages and looking for investors.
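To illustrate why retention can be cheaper than self-attention at inference time, here is a minimal sketch of the retention mechanism in its two equivalent forms: a parallel form (like attention, with an exponential decay mask) and a recurrent form that carries a fixed-size state, giving O(1) compute per generated token. The shapes, decay value, and function names below are illustrative assumptions, not Cauvery 7b's actual configuration.

```python
import numpy as np

def recurrent_retention(Q, K, V, gamma=0.9):
    """Recurrent form: process one token at a time with a fixed-size state.

    Q, K: (seq_len, d_k); V: (seq_len, d_v); gamma: scalar decay (assumed).
    Per-token cost is constant, which is the efficiency argument for RetNet.
    """
    d_k, d_v = K.shape[1], V.shape[1]
    S = np.zeros((d_k, d_v))            # recurrent state, size independent of seq_len
    outputs = []
    for q, k, v in zip(Q, K, V):
        S = gamma * S + np.outer(k, v)  # decay old state, add new key-value pair
        outputs.append(q @ S)           # read out with the current query
    return np.stack(outputs)

def parallel_retention(Q, K, V, gamma=0.9):
    """Parallel form: like causal attention, but scores are weighted by
    an explicit decay mask D[n, m] = gamma^(n - m) for m <= n (no softmax)."""
    n = Q.shape[0]
    idx = np.arange(n)
    D = np.tril(gamma ** (idx[:, None] - idx[None, :]))
    return (Q @ K.T * D) @ V

# The two forms produce the same outputs on random data.
rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 4))
K = rng.standard_normal((5, 4))
V = rng.standard_normal((5, 3))
assert np.allclose(recurrent_retention(Q, K, V), parallel_retention(Q, K, V))
```

The parallel form is used during training for GPU-friendly computation, while the recurrent form is used during generation; their equivalence is the key property of the retention mechanism.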