Author | Commit | Message | Date
------ | ------ | ------- | ----
niansa | 7cd3899dd0 | Check for correct magic value in llama | 2023-08-31 17:57:56 +02:00
niansa | 5d818e31aa | Call llama_backend_init()/llama_backend_free() | 2023-08-31 16:56:10 +02:00
niansa | 85eb2047cb | Improved llama.cpp version naming scheme | 2023-05-20 16:53:03 +02:00
niansa | 5feca59be7 | Fixed linebreaks and support latest llama.cpp | 2023-05-20 02:25:46 +02:00
niansa | c9dac7cb89 | Fixed file type detection | 2023-05-19 17:45:32 +02:00
niansa | 4974338e41 | Fixup step #2 | 2023-05-19 16:18:26 +02:00
niansa | 9bf70e3f5d | Renamed llama-mainline to llama_old | 2023-05-19 15:57:17 +02:00
niansa | b5e10d1fa3 | Use magic to identify llama models | 2023-05-19 02:40:57 +02:00
niansa | 8fbbf58622 | Removed magic_match for llama.cpp | 2023-05-18 17:49:22 +00:00
niansa | 60fe6b9c55 | Load implemenations as shared objects | 2023-05-16 19:10:05 +00:00
niansa | e4832f1077 | Added GPT-J serialization/deserialization | 2023-05-07 12:02:04 +02:00
 | 7aca184dba | Updated llama.cpp | 2023-04-27 08:22:51 +02:00
niansa | d09f892120 | Updated llama.cpp | 2023-04-22 15:34:58 +02:00
niansa | 4c9a3a308b | Updated llama.cpp | 2023-04-17 23:21:54 +02:00
niansa | f57a729853 | Fixed savestates | 2023-04-17 23:08:05 +02:00
niansa | 139935adb2 | Removed gpt2, updated python binding and added "savestates" | 2023-04-17 22:46:45 +02:00
niansa | 2d97e7b2bd | Updated llama.cpp | 2023-04-16 18:02:29 +02:00
niansa | 97d94ea2c9 | Updated llama.cpp and python bindings | 2023-04-05 19:35:41 +02:00
 | 8b5a375f59 | Updated llama.cpp | 2023-04-03 10:15:15 +02:00
niansa | 14410be6e5 | Updated llama.cpp | 2023-04-01 14:48:42 +02:00
 | aaddcc0cbd | Initial commit | 2023-03-30 07:03:33 -05:00